Apr 20 23:12:32.082752 ip-10-0-131-251 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 23:12:32.536887 ip-10-0-131-251 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:32.536887 ip-10-0-131-251 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 23:12:32.536887 ip-10-0-131-251 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:32.536887 ip-10-0-131-251 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 23:12:32.536887 ip-10-0-131-251 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
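The deprecation warnings above all point at the same remedy: move these flags into the file passed to the kubelet's --config flag. A minimal sketch of such a KubeletConfiguration, assuming the CRI-O socket shown later in this log; the volumePluginDir path and systemReserved values are illustrative assumptions, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is an illustrative assumption)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (values are illustrative assumptions)
systemReserved:
  cpu: 500m
  memory: 1Gi
```

The kubelet would then be started with --config pointing at this file (this node already passes --config="/etc/kubernetes/kubelet.conf" per the flag dump below in the log), and the deprecated command-line flags can be dropped.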
Apr 20 23:12:32.537964 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.537709 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 23:12:32.539947 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539931 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:32.539947 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539945 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539949 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539953 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539956 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539959 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539962 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539964 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539968 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539971 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539973 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539976 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539978 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539981 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539983 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539986 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539988 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539992 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539994 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539997 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.539999 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:32.540007 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540001 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540004 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540006 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540016 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540019 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540022 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540024 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540026 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540029 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540032 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540034 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540036 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540039 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540042 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540044 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540047 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540049 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540051 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540054 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540057 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:32.540491 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540059 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540061 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540064 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540066 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540069 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540071 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540073 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540076 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540078 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540081 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540084 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540086 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540089 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540091 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540094 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540097 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540099 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540102 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540104 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540107 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:32.540987 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540110 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540113 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540115 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540118 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540120 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540123 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540125 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540128 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540130 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540133 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540135 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540137 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540141 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540148 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540151 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540154 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540157 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540160 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540163 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:32.541496 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540174 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540177 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540179 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540182 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540184 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540187 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540560 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540566 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540569 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540572 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540574 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540577 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540579 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540582 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540585 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540587 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540589 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540592 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540595 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540597 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:32.541942 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540600 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540602 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540605 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540608 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540610 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540613 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540615 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540618 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540621 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540623 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540625 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540628 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540633 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540636 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540639 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540643 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540645 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540648 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540650 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:32.542448 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540653 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540656 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540658 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540660 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540664 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540667 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540670 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540672 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540675 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540677 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540679 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540682 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540684 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540687 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540689 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540691 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540694 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540696 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540698 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:32.542923 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540701 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540704 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540706 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540708 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540711 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540713 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540716 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540719 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540721 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540724 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540726 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540729 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540731 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540734 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540737 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540740 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540742 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540745 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540747 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540750 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:32.543413 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540752 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540754 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540757 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540759 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540761 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540764 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540766 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540768 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540771 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540773 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540775 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540778 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540780 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.540782 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540854 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540861 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540867 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540872 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540877 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540881 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540886 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 23:12:32.543933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540890 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540893 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540896 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540899 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540902 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540906 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540908 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540911 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540914 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540917 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540919 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540922 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540927 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540930 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540933 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540936 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540939 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540943 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540946 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540949 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540952 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540955 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540958 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540961 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540964 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 23:12:32.544434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540967 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540971 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540974 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540977 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540980 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540983 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540986 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540990 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540993 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540996 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.540999 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541002 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541010 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541013 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541015 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541018 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541021 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541024 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541027 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541030 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541033 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541035 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541038 2575 flags.go:64] FLAG: --feature-gates=""
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541042 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541044 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 23:12:32.545055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541047 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541050 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 23:12:32.545651
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541053 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541056 2575 flags.go:64] FLAG: --help="false" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541059 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-131-251.ec2.internal" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541062 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541065 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541067 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541071 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541075 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541078 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541081 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541084 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541086 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541089 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541092 2575 
flags.go:64] FLAG: --kube-api-qps="50" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541095 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541097 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541100 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541103 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541106 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541108 2575 flags.go:64] FLAG: --lock-file="" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541111 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541113 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 23:12:32.545651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541116 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541121 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541124 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541127 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541129 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541132 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 23:12:32.546214 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:12:32.541136 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541139 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541142 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541146 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541149 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541153 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541155 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541158 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541161 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541163 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541166 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541169 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541172 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541180 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541183 2575 
flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541186 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541189 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 23:12:32.546214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541192 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541197 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541199 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541202 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541205 2575 flags.go:64] FLAG: --port="10250" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541208 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541211 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0286cefb566c3ae4b" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541214 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541217 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541219 2575 flags.go:64] FLAG: --register-node="true" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541222 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541225 2575 flags.go:64] FLAG: 
--register-with-taints="" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541229 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541231 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541234 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541237 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541241 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541244 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541247 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541250 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541252 2575 flags.go:64] FLAG: --runonce="false" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541255 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541258 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541260 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541263 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541266 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 23:12:32.546846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541269 
2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541272 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541284 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541287 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541290 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541293 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541295 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541298 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541301 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541304 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541309 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541312 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541314 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541318 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541321 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 23:12:32.547477 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541324 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541326 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541329 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541332 2575 flags.go:64] FLAG: --v="2" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541340 2575 flags.go:64] FLAG: --version="false" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541344 2575 flags.go:64] FLAG: --vmodule="" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541351 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541354 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541439 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 23:12:32.547477 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541443 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541446 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541449 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541451 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541454 2575 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541457 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541459 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541476 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541479 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541482 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541485 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541488 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541491 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541493 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541496 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541499 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541501 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541504 2575 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541507 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541509 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541512 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 23:12:32.548055 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541514 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541517 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541519 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541522 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541524 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541527 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541529 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541532 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541535 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 23:12:32.548614 ip-10-0-131-251 
kubenswrapper[2575]: W0420 23:12:32.541538 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541540 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541543 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541545 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541548 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541550 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541553 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541555 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541558 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541560 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 23:12:32.548614 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541563 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541565 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541568 
2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541570 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541574 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541576 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541579 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541581 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541584 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541586 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541589 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541591 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541594 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541596 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541600 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541603 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541606 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541609 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541612 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 23:12:32.549080 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541615 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541617 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541621 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541625 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541628 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541631 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541633 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541636 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541638 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541641 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541643 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541645 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541648 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541651 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541653 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541656 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541658 2575 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541661 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541664 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 23:12:32.549558 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541666 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541669 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541671 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541674 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541676 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541678 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.541681 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.541690 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.548163 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.548176 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548223 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548227 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548231 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548234 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548237 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548239 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 23:12:32.550010 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548242 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548245 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548247 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548250 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 
23:12:32.548252 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548255 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548257 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548259 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548262 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548265 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548267 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548270 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548273 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548276 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548278 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548281 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548284 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig 
Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548286 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548289 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548291 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 23:12:32.550409 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548294 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548296 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548299 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548301 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548303 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548306 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548309 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548313 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548316 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548319 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548321 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548324 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548327 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548330 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548333 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548335 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548338 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548341 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548343 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 23:12:32.550907 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548346 2575 feature_gate.go:328] unrecognized 
feature gate: Example2 Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548349 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548352 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548354 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548357 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548360 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548364 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548367 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548369 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548372 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548375 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548377 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548380 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548382 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548385 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548387 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548389 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548392 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548394 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 23:12:32.551358 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548397 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548399 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548401 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548404 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548406 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548409 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 
23:12:32.548411 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548414 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548416 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548419 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548421 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548424 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548426 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548429 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548431 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548434 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548437 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548439 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548441 2575 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiNetworks Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548444 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 23:12:32.551855 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548446 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548449 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.548453 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548561 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548565 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548568 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548570 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548573 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548576 2575 feature_gate.go:328] unrecognized feature 
gate: GCPClusterHostedDNSInstall Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548578 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548580 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548583 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548585 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548588 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548590 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548593 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 23:12:32.552375 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548595 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548597 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548600 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548602 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548605 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548607 2575 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548610 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548612 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548615 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548617 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548620 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548622 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548625 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548628 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548630 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548633 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548635 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548637 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 23:12:32.552773 ip-10-0-131-251 
kubenswrapper[2575]: W0420 23:12:32.548640 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548642 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 23:12:32.552773 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548645 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548647 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548650 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548652 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548654 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548657 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548659 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548662 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548664 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548667 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548669 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 
23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548672 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548674 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548676 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548679 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548681 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548683 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548686 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548690 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548693 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 23:12:32.553240 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548696 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548699 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548701 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548704 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548707 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548710 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548713 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548716 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548718 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548721 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548723 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548726 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548728 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548731 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548733 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548736 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548738 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548741 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548743 2575 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 23:12:32.553738 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548745 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548748 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548750 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548752 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548755 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548757 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548759 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548762 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548764 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548767 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548772 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548775 2575 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548777 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:32.548780 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.548785 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 23:12:32.554193 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.549500 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 23:12:32.554587 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.553730 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 23:12:32.554807 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.554795 2575 server.go:1019] "Starting client certificate rotation" Apr 20 23:12:32.554915 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.554898 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 23:12:32.554947 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.554935 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 23:12:32.582387 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.582369 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 23:12:32.588066 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.588043 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 23:12:32.600969 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.600951 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 20 23:12:32.608518 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.608501 2575 log.go:25] "Validated CRI v1 image API"
Apr 20 23:12:32.609682 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.609665 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 23:12:32.610667 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.610651 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 23:12:32.612219 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.612195 2575 fs.go:135] Filesystem UUIDs: map[751b2cf4-36ec-435b-a9e6-5f8861b32f76:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 be2870f0-13b9-4308-bb9f-2ab47e77b109:/dev/nvme0n1p3]
Apr 20 23:12:32.612287 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.612218 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 23:12:32.618203 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.618100 2575 manager.go:217] Machine: {Timestamp:2026-04-20 23:12:32.615752006 +0000 UTC m=+0.416885692 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3121215 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec259ff81719ca5c907aa0bf29504f75 SystemUUID:ec259ff8-1719-ca5c-907a-a0bf29504f75 BootID:a1c9d929-acfb-4c8e-b05d-081974fdaaf9 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cf:0f:0e:40:85 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cf:0f:0e:40:85 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:4a:4b:92:65:f3:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 23:12:32.618203 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.618202 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 23:12:32.618311 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.618271 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 23:12:32.619892 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.619869 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 23:12:32.620032 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.619895 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-251.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 23:12:32.620080 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.620042 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 23:12:32.620080 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.620052 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 23:12:32.620080 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.620064 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 23:12:32.621072 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.621062 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 23:12:32.622537 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.622527 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 23:12:32.622653 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.622643 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 23:12:32.625895 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.625885 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 23:12:32.625934 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.625905 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 23:12:32.625934 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.625921 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 23:12:32.625934 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.625932 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 20 23:12:32.626013 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.625944 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 23:12:32.626922 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.626908 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 23:12:32.627108 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.626927 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 23:12:32.632504 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.632488 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 23:12:32.633825 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.633812 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 23:12:32.635659 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635646 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635664 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635670 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635675 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635685 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635698 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635703 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635711 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 23:12:32.635719 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635720 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 23:12:32.635935 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635727 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 23:12:32.635935 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635736 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 23:12:32.635935 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635744 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 23:12:32.635935 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.635724 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mz26n"
Apr 20 23:12:32.636441 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.636419 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-251.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 23:12:32.636578 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.636559 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 23:12:32.637669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.637658 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 23:12:32.637669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.637671 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 23:12:32.640965 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.640951 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 23:12:32.641031 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.640986 2575 server.go:1295] "Started kubelet"
Apr 20 23:12:32.641087 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.641063 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 23:12:32.641132 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.641071 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 23:12:32.641177 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.641137 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 23:12:32.641686 ip-10-0-131-251 systemd[1]: Started Kubernetes Kubelet.
Apr 20 23:12:32.642383 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.642367 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 23:12:32.643848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.643832 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 23:12:32.645171 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.645150 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mz26n"
Apr 20 23:12:32.648948 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.648930 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-251.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:12:32.649648 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.649632 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 23:12:32.649735 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.649677 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 23:12:32.649937 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.648950 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-251.ec2.internal.18a833916a191731 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-251.ec2.internal,UID:ip-10-0-131-251.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-251.ec2.internal,},FirstTimestamp:2026-04-20 23:12:32.640964401 +0000 UTC m=+0.442098091,LastTimestamp:2026-04-20 23:12:32.640964401 +0000 UTC m=+0.442098091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-251.ec2.internal,}"
Apr 20 23:12:32.650289 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650274 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 23:12:32.650403 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650384 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 23:12:32.650403 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.650360 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 23:12:32.650565 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650273 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 23:12:32.650565 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650524 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 23:12:32.650565 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650542 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 23:12:32.650708 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.650565 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:32.650882 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650867 2575 factory.go:153] Registering CRI-O factory
Apr 20 23:12:32.650953 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650926 2575 factory.go:223] Registration of the crio container factory successfully
Apr 20 23:12:32.651001 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.650990 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 23:12:32.651033 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.651000 2575 factory.go:55] Registering systemd factory
Apr 20 23:12:32.651033 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.651009 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 20 23:12:32.651033 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.651032 2575 factory.go:103] Registering Raw factory
Apr 20 23:12:32.651139 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.651046 2575 manager.go:1196] Started watching for new ooms in manager
Apr 20 23:12:32.651768 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.651746 2575 manager.go:319] Starting recovery of all containers
Apr 20 23:12:32.654064 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.653900 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:32.657275 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.657257 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-251.ec2.internal\" not found" node="ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.662956 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.662924 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 23:12:32.663752 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.663729 2575 manager.go:324] Recovery completed
Apr 20 23:12:32.669584 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.669570 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:32.672971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.672953 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:32.673034 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.672982 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:32.673034 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.672992 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:32.673431 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.673417 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 23:12:32.673431 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.673429 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 23:12:32.673523 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.673485 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 23:12:32.676176 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.676164 2575 policy_none.go:49] "None policy: Start"
Apr 20 23:12:32.676218 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.676180 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 23:12:32.676218 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.676195 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 23:12:32.709861 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.709846 2575 manager.go:341] "Starting Device Plugin manager"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.709879 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.709897 2575 server.go:85] "Starting device plugin registration server"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.710106 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.710118 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.710188 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.710264 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.710273 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.710943 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 23:12:32.729856 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.710979 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:32.777418 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.777396 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 23:12:32.777538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.777431 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 23:12:32.777538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.777494 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 23:12:32.777538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.777502 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 23:12:32.777538 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.777529 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 23:12:32.779868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.779853 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:32.810881 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.810839 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:32.811730 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.811704 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:32.811816 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.811736 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:32.811816 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.811747 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:32.811816 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.811772 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.820230 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.820209 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.820230 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.820227 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-251.ec2.internal\": node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:32.840735 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.840719 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:32.878057 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.878036 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"]
Apr 20 23:12:32.878118 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.878111 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:32.878832 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.878817 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:32.878918 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.878841 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:32.878918 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.878850 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:32.880420 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.880408 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:32.880583 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.880569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.880632 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.880598 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:32.881065 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.881051 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:32.881146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.881076 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:32.881146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.881088 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:32.881146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.881095 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:32.881146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.881112 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:32.881146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.881121 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:32.882708 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.882694 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.882754 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.882720 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:32.883376 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.883361 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:32.883455 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.883403 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:32.883455 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.883419 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:32.896990 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.896976 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-251.ec2.internal\" not found" node="ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.900808 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.900792 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-251.ec2.internal\" not found" node="ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.941097 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:32.941079 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:32.952726 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.952704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2caee1c055f23a368d6b80867e13d8e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal\" (UID: \"2caee1c055f23a368d6b80867e13d8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.952801 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.952730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2caee1c055f23a368d6b80867e13d8e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal\" (UID: \"2caee1c055f23a368d6b80867e13d8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:32.952801 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:32.952776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0f860586cfb743e565374c69520bb765-config\") pod \"kube-apiserver-proxy-ip-10-0-131-251.ec2.internal\" (UID: \"0f860586cfb743e565374c69520bb765\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.041298 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.041276 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.053708 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.053684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2caee1c055f23a368d6b80867e13d8e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal\" (UID: \"2caee1c055f23a368d6b80867e13d8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.053755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.053714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0f860586cfb743e565374c69520bb765-config\") pod \"kube-apiserver-proxy-ip-10-0-131-251.ec2.internal\" (UID: \"0f860586cfb743e565374c69520bb765\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.053755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.053730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2caee1c055f23a368d6b80867e13d8e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal\" (UID: \"2caee1c055f23a368d6b80867e13d8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.053817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.053785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2caee1c055f23a368d6b80867e13d8e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal\" (UID: \"2caee1c055f23a368d6b80867e13d8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.053817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.053797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0f860586cfb743e565374c69520bb765-config\") pod \"kube-apiserver-proxy-ip-10-0-131-251.ec2.internal\" (UID: \"0f860586cfb743e565374c69520bb765\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.053880 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.053847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2caee1c055f23a368d6b80867e13d8e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal\" (UID: \"2caee1c055f23a368d6b80867e13d8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.142109 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.142065 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.198604 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.198575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.203256 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.203237 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"
Apr 20 23:12:33.242921 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.242900 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.343383 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.343363 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.444079 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.444028 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.446303 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.446284 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:33.545013 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.544990 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.554391 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.554367 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 23:12:33.554533 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.554514 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 23:12:33.554585 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.554515 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 23:12:33.554585 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.554515 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 23:12:33.646129 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.646101 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found"
Apr 20 23:12:33.648213 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.648187 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 23:07:32 +0000 UTC" deadline="2027-10-04 14:27:07.54213197 +0000 UTC"
Apr 20 23:12:33.648275 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.648213 2575 certificate_manager.go:431] "Waiting for next certificate rotation"
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12759h14m33.893922189s" Apr 20 23:12:33.649759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.649742 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 23:12:33.668514 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.668492 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 23:12:33.685866 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.685848 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h5nlb" Apr 20 23:12:33.694276 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.694229 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h5nlb" Apr 20 23:12:33.746668 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.746645 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found" Apr 20 23:12:33.790921 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:33.790886 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f860586cfb743e565374c69520bb765.slice/crio-5a7053d1176708e4739b3313c61703e80c831273ef1f12eb9eee6bad20d4ec87 WatchSource:0}: Error finding container 5a7053d1176708e4739b3313c61703e80c831273ef1f12eb9eee6bad20d4ec87: Status 404 returned error can't find the container with id 5a7053d1176708e4739b3313c61703e80c831273ef1f12eb9eee6bad20d4ec87 Apr 20 23:12:33.791272 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:33.791236 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2caee1c055f23a368d6b80867e13d8e0.slice/crio-44ec370828c24932ad0509c7804c15a09b9460fccfc7acf1f10f83a930723681 WatchSource:0}: Error finding container 44ec370828c24932ad0509c7804c15a09b9460fccfc7acf1f10f83a930723681: Status 404 returned error can't find the container with id 44ec370828c24932ad0509c7804c15a09b9460fccfc7acf1f10f83a930723681 Apr 20 23:12:33.795130 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:33.795116 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 23:12:33.847454 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.847433 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found" Apr 20 23:12:33.947962 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:33.947925 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-251.ec2.internal\" not found" Apr 20 23:12:34.009708 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.009684 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:12:34.050601 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.050575 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal" Apr 20 23:12:34.063019 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.062999 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 23:12:34.064150 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.064136 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal" Apr 20 23:12:34.072115 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.072100 
2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 23:12:34.627413 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.627381 2575 apiserver.go:52] "Watching apiserver" Apr 20 23:12:34.640213 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.640034 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 23:12:34.640418 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.640382 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-h9hnp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal","openshift-multus/multus-additional-cni-plugins-jncnx","openshift-multus/multus-l4jz9","openshift-network-operator/iptables-alerter-n5czg","kube-system/konnectivity-agent-d5rtn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7","openshift-cluster-node-tuning-operator/tuned-cqf4v","openshift-dns/node-resolver-w7nck","openshift-multus/network-metrics-daemon-qklww","openshift-network-diagnostics/network-check-target-b9mzc","openshift-ovn-kubernetes/ovnkube-node-lkbxg","kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal"] Apr 20 23:12:34.642264 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.642235 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:34.642376 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.642325 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:12:34.645748 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.645728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.647053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.647028 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.647156 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.647128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.648314 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.648295 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.648314 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.648303 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 23:12:34.648528 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.648433 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 23:12:34.648604 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.648555 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.648659 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.648642 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qhtjr\"" Apr 20 23:12:34.649081 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.648840 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.649491 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649456 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jhf8q\"" Apr 20 23:12:34.650053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649477 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.650053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649692 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4f2gz\"" Apr 20 23:12:34.650053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649703 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 23:12:34.650053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.650053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649825 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.650053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.649709 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.650362 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.650352 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.651701 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.651659 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vx8r6\"" Apr 20 23:12:34.653010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.652309 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 23:12:34.653010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.652560 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.653010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.652732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 23:12:34.653010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.652837 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.653010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.652810 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 23:12:34.653010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.652889 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rsfgq\"" Apr 20 23:12:34.654434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.654057 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.655596 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.655577 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.655679 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.655616 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.656061 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.656043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 23:12:34.656408 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.656389 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jht7f\"" Apr 20 23:12:34.656557 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.656530 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 23:12:34.657271 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.657255 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:34.657352 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.657324 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:12:34.657676 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.657654 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.657756 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.657689 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.657756 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.657722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.657863 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.657854 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-cljzk\"" Apr 20 23:12:34.657983 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.657961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 23:12:34.658042 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.658000 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gxzb9\"" Apr 20 23:12:34.658042 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.658027 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.659081 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.659059 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.661483 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.661450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vbkp7\"" Apr 20 23:12:34.661576 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.661524 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 23:12:34.661756 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.661736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 23:12:34.661830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.661796 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 23:12:34.661830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.661821 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 23:12:34.661969 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.661952 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 23:12:34.662124 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662102 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 23:12:34.662195 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " 
pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:34.662195 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-system-cni-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.662296 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-cnibin\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.662296 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-cni-bin\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.662296 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-etc-kubernetes\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.662296 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f44fca8-2437-4e45-8b93-eac1d3f54370-tmp-dir\") pod \"node-resolver-w7nck\" (UID: 
\"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.662515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-k8s-cni-cncf-io\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.662515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dngdd\" (UniqueName: \"kubernetes.io/projected/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-kube-api-access-dngdd\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.662515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662401 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wrk\" (UniqueName: \"kubernetes.io/projected/7f44fca8-2437-4e45-8b93-eac1d3f54370-kube-api-access-26wrk\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.662515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-system-cni-dir\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.662515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-cnibin\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.662515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-os-release\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-cni-binary-copy\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr48r\" (UniqueName: \"kubernetes.io/projected/92cdfbdf-902b-416d-976d-04adddd35e2b-kube-api-access-pr48r\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de301afe-fe9a-4095-9efc-83afb3de4d54-host-slash\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " 
pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e0fc07a-cb69-4b29-bb33-219b34f8a7ef-agent-certs\") pod \"konnectivity-agent-d5rtn\" (UID: \"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef\") " pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e0fc07a-cb69-4b29-bb33-219b34f8a7ef-konnectivity-ca\") pod \"konnectivity-agent-d5rtn\" (UID: \"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef\") " pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f44fca8-2437-4e45-8b93-eac1d3f54370-hosts-file\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0d58ae6-6867-491f-888f-03272f7c80e7-host\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.662796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mmj\" (UniqueName: \"kubernetes.io/projected/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-kube-api-access-24mmj\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-kubelet\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-hostroot\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-cni-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662919 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-netns\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-cni-multus\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.662997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de301afe-fe9a-4095-9efc-83afb3de4d54-iptables-alerter-script\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-cni-binary-copy\") pod \"multus-l4jz9\" (UID: 
\"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-socket-dir-parent\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-conf-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f0d58ae6-6867-491f-888f-03272f7c80e7-serviceca\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.663211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.663854 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-os-release\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663854 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-daemon-config\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663854 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-multus-certs\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.663854 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zqq\" (UniqueName: \"kubernetes.io/projected/de301afe-fe9a-4095-9efc-83afb3de4d54-kube-api-access-62zqq\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.663854 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.663388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98h8\" (UniqueName: \"kubernetes.io/projected/f0d58ae6-6867-491f-888f-03272f7c80e7-kube-api-access-x98h8\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.694873 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.694847 2575 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:07:33 +0000 UTC" deadline="2027-12-21 17:16:19.783969939 +0000 UTC" Apr 20 23:12:34.694873 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.694869 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14634h3m45.089103098s" Apr 20 23:12:34.751350 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.751333 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 23:12:34.763870 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-conf-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.763998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f0d58ae6-6867-491f-888f-03272f7c80e7-serviceca\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.763998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-slash\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.763998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-log-socket\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.763998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/172d49a4-6e7e-4772-9e06-73cad0eec748-ovn-node-metrics-cert\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.763998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-conf-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.763998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.763985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-run\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-os-release\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-multus-certs\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-multus-certs\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-os-release\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62zqq\" (UniqueName: \"kubernetes.io/projected/de301afe-fe9a-4095-9efc-83afb3de4d54-kube-api-access-62zqq\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x98h8\" (UniqueName: \"kubernetes.io/projected/f0d58ae6-6867-491f-888f-03272f7c80e7-kube-api-access-x98h8\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-sys-fs\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl8kw\" (UniqueName: \"kubernetes.io/projected/776a3be8-8dcf-4d7b-943b-adcc065e879d-kube-api-access-cl8kw\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-kubernetes\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f44fca8-2437-4e45-8b93-eac1d3f54370-tmp-dir\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-system-cni-dir\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.764292 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.764276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr48r\" (UniqueName: \"kubernetes.io/projected/92cdfbdf-902b-416d-976d-04adddd35e2b-kube-api-access-pr48r\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-host\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-systemd-units\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-etc-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-env-overrides\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-os-release\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f0d58ae6-6867-491f-888f-03272f7c80e7-serviceca\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-cni-binary-copy\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-ovnkube-config\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de301afe-fe9a-4095-9efc-83afb3de4d54-host-slash\") pod \"iptables-alerter-n5czg\" 
(UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e0fc07a-cb69-4b29-bb33-219b34f8a7ef-agent-certs\") pod \"konnectivity-agent-d5rtn\" (UID: \"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef\") " pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0d58ae6-6867-491f-888f-03272f7c80e7-host\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-system-cni-dir\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-device-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysconfig\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764622 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-sys\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.764868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-node-log\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-kubelet\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f44fca8-2437-4e45-8b93-eac1d3f54370-tmp-dir\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-hostroot\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-tuned\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-ovnkube-script-lib\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/de301afe-fe9a-4095-9efc-83afb3de4d54-host-slash\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.764817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-os-release\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-cni-binary-copy\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765229 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-kubelet\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-cni-multus\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.765285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-etc-selinux\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-registration-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.765646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysctl-conf\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-run-netns\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-systemd\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.765561 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysctl-d\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-hostroot\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.765608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-cni-multus\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0d58ae6-6867-491f-888f-03272f7c80e7-host\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/92cdfbdf-902b-416d-976d-04adddd35e2b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-daemon-config\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-kubelet\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-cni-netd\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766336 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-system-cni-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.766439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-cnibin\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-system-cni-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.766496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-cnibin\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-cni-bin\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.766580 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-etc-kubernetes\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-var-lib-cni-bin\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-cnibin\") pod \"multus-additional-cni-plugins-jncnx\" (UID: 
\"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-socket-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/92cdfbdf-902b-416d-976d-04adddd35e2b-cnibin\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-k8s-cni-cncf-io\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-daemon-config\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dngdd\" (UniqueName: 
\"kubernetes.io/projected/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-kube-api-access-dngdd\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.766770 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. No retries permitted until 2026-04-20 23:12:35.266748876 +0000 UTC m=+3.067882552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-etc-kubernetes\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26wrk\" (UniqueName: \"kubernetes.io/projected/7f44fca8-2437-4e45-8b93-eac1d3f54370-kube-api-access-26wrk\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-ovn\") pod \"ovnkube-node-lkbxg\" (UID: 
\"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.767283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmlfj\" (UniqueName: \"kubernetes.io/projected/172d49a4-6e7e-4772-9e06-73cad0eec748-kube-api-access-pmlfj\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e0fc07a-cb69-4b29-bb33-219b34f8a7ef-konnectivity-ca\") pod \"konnectivity-agent-d5rtn\" (UID: \"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef\") " pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.766980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f44fca8-2437-4e45-8b93-eac1d3f54370-hosts-file\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24mmj\" (UniqueName: 
\"kubernetes.io/projected/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-kube-api-access-24mmj\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-k8s-cni-cncf-io\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-modprobe-d\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-systemd\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0d263e2-967d-4e71-96f0-73c4215ee6c2-tmp\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767182 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrq4z\" (UniqueName: \"kubernetes.io/projected/b0d263e2-967d-4e71-96f0-73c4215ee6c2-kube-api-access-zrq4z\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-cni-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-netns\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de301afe-fe9a-4095-9efc-83afb3de4d54-iptables-alerter-script\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-var-lib-kubelet\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.768089 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:12:34.767306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f44fca8-2437-4e45-8b93-eac1d3f54370-hosts-file\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") " pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-var-lib-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-cni-dir\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e0fc07a-cb69-4b29-bb33-219b34f8a7ef-konnectivity-ca\") pod \"konnectivity-agent-d5rtn\" (UID: \"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef\") " pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-host-run-netns\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767414 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-run-ovn-kubernetes\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-cni-binary-copy\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-socket-dir-parent\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-lib-modules\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-cni-bin\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 
23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-multus-socket-dir-parent\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.767840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/de301afe-fe9a-4095-9efc-83afb3de4d54-iptables-alerter-script\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.768859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.768409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-cni-binary-copy\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.769266 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.768980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e0fc07a-cb69-4b29-bb33-219b34f8a7ef-agent-certs\") pod \"konnectivity-agent-d5rtn\" (UID: \"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef\") " pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:34.771727 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.771704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98h8\" (UniqueName: \"kubernetes.io/projected/f0d58ae6-6867-491f-888f-03272f7c80e7-kube-api-access-x98h8\") pod \"node-ca-h9hnp\" (UID: \"f0d58ae6-6867-491f-888f-03272f7c80e7\") " pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.772139 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.772116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr48r\" (UniqueName: \"kubernetes.io/projected/92cdfbdf-902b-416d-976d-04adddd35e2b-kube-api-access-pr48r\") pod \"multus-additional-cni-plugins-jncnx\" (UID: \"92cdfbdf-902b-416d-976d-04adddd35e2b\") " pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:34.772485 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.772439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zqq\" (UniqueName: \"kubernetes.io/projected/de301afe-fe9a-4095-9efc-83afb3de4d54-kube-api-access-62zqq\") pod \"iptables-alerter-n5czg\" (UID: \"de301afe-fe9a-4095-9efc-83afb3de4d54\") " pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:34.776349 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.776326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dngdd\" (UniqueName: \"kubernetes.io/projected/41e804c8-cacd-4b38-ba49-6a0ee8e095cf-kube-api-access-dngdd\") pod \"multus-l4jz9\" (UID: \"41e804c8-cacd-4b38-ba49-6a0ee8e095cf\") " pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.777121 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.777101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mmj\" (UniqueName: \"kubernetes.io/projected/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-kube-api-access-24mmj\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:34.777217 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.777200 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wrk\" (UniqueName: \"kubernetes.io/projected/7f44fca8-2437-4e45-8b93-eac1d3f54370-kube-api-access-26wrk\") pod \"node-resolver-w7nck\" (UID: \"7f44fca8-2437-4e45-8b93-eac1d3f54370\") 
" pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.783952 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.783898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal" event={"ID":"0f860586cfb743e565374c69520bb765","Type":"ContainerStarted","Data":"5a7053d1176708e4739b3313c61703e80c831273ef1f12eb9eee6bad20d4ec87"} Apr 20 23:12:34.784907 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.784883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal" event={"ID":"2caee1c055f23a368d6b80867e13d8e0","Type":"ContainerStarted","Data":"44ec370828c24932ad0509c7804c15a09b9460fccfc7acf1f10f83a930723681"} Apr 20 23:12:34.868335 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-registration-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.868444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysctl-conf\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.868444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-run-netns\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-systemd\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-registration-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.868444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysctl-d\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.868444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-kubelet\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-systemd\") pod \"ovnkube-node-lkbxg\" (UID: 
\"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-cni-netd\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-run-netns\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-cni-netd\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-kubelet\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868550 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-socket-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysctl-d\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-ovn\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868606 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pmlfj\" (UniqueName: \"kubernetes.io/projected/172d49a4-6e7e-4772-9e06-73cad0eec748-kube-api-access-pmlfj\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysctl-conf\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-modprobe-d\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-ovn\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868673 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-systemd\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.868741 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-socket-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868697 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0d263e2-967d-4e71-96f0-73c4215ee6c2-tmp\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-run-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrq4z\" (UniqueName: \"kubernetes.io/projected/b0d263e2-967d-4e71-96f0-73c4215ee6c2-kube-api-access-zrq4z\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868743 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-systemd\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-var-lib-kubelet\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-var-lib-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-modprobe-d\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-run-ovn-kubernetes\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: 
I0420 23:12:34.868830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-var-lib-kubelet\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-lib-modules\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-cni-bin\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-var-lib-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-slash\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.868893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-log-socket\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/172d49a4-6e7e-4772-9e06-73cad0eec748-ovn-node-metrics-cert\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-cni-bin\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-run\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.869436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-lib-modules\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868949 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-slash\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868963 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-log-socket\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-sys-fs\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.868981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-host-run-ovn-kubernetes\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-sys-fs\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.869002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl8kw\" (UniqueName: \"kubernetes.io/projected/776a3be8-8dcf-4d7b-943b-adcc065e879d-kube-api-access-cl8kw\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-run\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-kubernetes\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-host\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-systemd-units\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: 
I0420 23:12:34.869107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-etc-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-host\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869107 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-kubernetes\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-systemd-units\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-etc-openvswitch\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869172 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-env-overrides\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.870444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-ovnkube-config\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-device-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysconfig\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-sys\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:12:34.869304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-node-log\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-tuned\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-ovnkube-script-lib\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 
23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-etc-selinux\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-sysconfig\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-device-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/776a3be8-8dcf-4d7b-943b-adcc065e879d-etc-selinux\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/172d49a4-6e7e-4772-9e06-73cad0eec748-node-log\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0d263e2-967d-4e71-96f0-73c4215ee6c2-sys\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.869772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-ovnkube-config\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.871258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.870072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-env-overrides\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.872014 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.870157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/172d49a4-6e7e-4772-9e06-73cad0eec748-ovnkube-script-lib\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.872014 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.871174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b0d263e2-967d-4e71-96f0-73c4215ee6c2-tmp\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.872014 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.871362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/172d49a4-6e7e-4772-9e06-73cad0eec748-ovn-node-metrics-cert\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.872014 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.871921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b0d263e2-967d-4e71-96f0-73c4215ee6c2-etc-tuned\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.874354 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.874297 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:12:34.874354 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.874321 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:12:34.874354 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.874336 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:34.874624 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:34.874410 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:35.374391568 +0000 UTC m=+3.175525260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:34.876533 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.876515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrq4z\" (UniqueName: \"kubernetes.io/projected/b0d263e2-967d-4e71-96f0-73c4215ee6c2-kube-api-access-zrq4z\") pod \"tuned-cqf4v\" (UID: \"b0d263e2-967d-4e71-96f0-73c4215ee6c2\") " pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:34.877190 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.877165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmlfj\" (UniqueName: \"kubernetes.io/projected/172d49a4-6e7e-4772-9e06-73cad0eec748-kube-api-access-pmlfj\") pod \"ovnkube-node-lkbxg\" (UID: \"172d49a4-6e7e-4772-9e06-73cad0eec748\") " pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:34.877269 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.877189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl8kw\" (UniqueName: 
\"kubernetes.io/projected/776a3be8-8dcf-4d7b-943b-adcc065e879d-kube-api-access-cl8kw\") pod \"aws-ebs-csi-driver-node-8wzx7\" (UID: \"776a3be8-8dcf-4d7b-943b-adcc065e879d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:34.936841 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.936792 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:12:34.950804 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.950780 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:12:34.963765 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.963742 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-l4jz9" Apr 20 23:12:34.974969 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.974944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w7nck" Apr 20 23:12:34.986105 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.986086 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h9hnp" Apr 20 23:12:34.993075 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:34.993058 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jncnx" Apr 20 23:12:35.000638 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.000615 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n5czg" Apr 20 23:12:35.007246 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.007228 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:35.016829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.016811 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" Apr 20 23:12:35.024397 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.024379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" Apr 20 23:12:35.029075 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.029051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:35.273312 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.272211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:35.273312 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:35.272377 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:35.273312 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:35.272440 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. No retries permitted until 2026-04-20 23:12:36.27242095 +0000 UTC m=+4.073554637 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:12:35.473568 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.473533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:35.473746 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:35.473706 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:12:35.473746 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:35.473731 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:12:35.473746 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:35.473743 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:35.473882 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:35.473809 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. 
No retries permitted until 2026-04-20 23:12:36.473786902 +0000 UTC m=+4.274920580 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:12:35.512777 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.512754 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0fc07a_cb69_4b29_bb33_219b34f8a7ef.slice/crio-c1284e597e809395535222bd43e00415d44a4fda2d056023342a9626e063cd5b WatchSource:0}: Error finding container c1284e597e809395535222bd43e00415d44a4fda2d056023342a9626e063cd5b: Status 404 returned error can't find the container with id c1284e597e809395535222bd43e00415d44a4fda2d056023342a9626e063cd5b Apr 20 23:12:35.525521 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.525346 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde301afe_fe9a_4095_9efc_83afb3de4d54.slice/crio-d5ecd15d1d119025a071755b46dee21431c81a2612490116d1494332e3bc79b6 WatchSource:0}: Error finding container d5ecd15d1d119025a071755b46dee21431c81a2612490116d1494332e3bc79b6: Status 404 returned error can't find the container with id d5ecd15d1d119025a071755b46dee21431c81a2612490116d1494332e3bc79b6 Apr 20 23:12:35.526867 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.526781 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d58ae6_6867_491f_888f_03272f7c80e7.slice/crio-934c8a621dab3c1af0acf80dab17582413c0bba6bbd075c3233e7c995801f351 WatchSource:0}: Error finding container 
934c8a621dab3c1af0acf80dab17582413c0bba6bbd075c3233e7c995801f351: Status 404 returned error can't find the container with id 934c8a621dab3c1af0acf80dab17582413c0bba6bbd075c3233e7c995801f351 Apr 20 23:12:35.527507 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.527489 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92cdfbdf_902b_416d_976d_04adddd35e2b.slice/crio-fb5671fb6732170629eb5e978351da2eaf5cafd60543004c42c0295f50fd908f WatchSource:0}: Error finding container fb5671fb6732170629eb5e978351da2eaf5cafd60543004c42c0295f50fd908f: Status 404 returned error can't find the container with id fb5671fb6732170629eb5e978351da2eaf5cafd60543004c42c0295f50fd908f Apr 20 23:12:35.533069 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.533047 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776a3be8_8dcf_4d7b_943b_adcc065e879d.slice/crio-d2d7e097257f5e8b5ee5547b640fe7c224cb4239c8aacab6454d4c767a3f1ab1 WatchSource:0}: Error finding container d2d7e097257f5e8b5ee5547b640fe7c224cb4239c8aacab6454d4c767a3f1ab1: Status 404 returned error can't find the container with id d2d7e097257f5e8b5ee5547b640fe7c224cb4239c8aacab6454d4c767a3f1ab1 Apr 20 23:12:35.533888 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.533828 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f44fca8_2437_4e45_8b93_eac1d3f54370.slice/crio-0d4fe1c4f754227f61c9570d90abd4387d44b2faae8755a9cc8dfcb4250ad03d WatchSource:0}: Error finding container 0d4fe1c4f754227f61c9570d90abd4387d44b2faae8755a9cc8dfcb4250ad03d: Status 404 returned error can't find the container with id 0d4fe1c4f754227f61c9570d90abd4387d44b2faae8755a9cc8dfcb4250ad03d Apr 20 23:12:35.534591 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.534567 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d263e2_967d_4e71_96f0_73c4215ee6c2.slice/crio-23c179b48d46182193f7f414a90c53fefe94a149601ae36b9722e21d6e26db9f WatchSource:0}: Error finding container 23c179b48d46182193f7f414a90c53fefe94a149601ae36b9722e21d6e26db9f: Status 404 returned error can't find the container with id 23c179b48d46182193f7f414a90c53fefe94a149601ae36b9722e21d6e26db9f Apr 20 23:12:35.535487 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.535393 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod172d49a4_6e7e_4772_9e06_73cad0eec748.slice/crio-c1cfd5887a04866cb99e0d9b8392e4a3f91ee777ab53d4466ac8297999d7ba56 WatchSource:0}: Error finding container c1cfd5887a04866cb99e0d9b8392e4a3f91ee777ab53d4466ac8297999d7ba56: Status 404 returned error can't find the container with id c1cfd5887a04866cb99e0d9b8392e4a3f91ee777ab53d4466ac8297999d7ba56 Apr 20 23:12:35.537442 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:12:35.537418 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e804c8_cacd_4b38_ba49_6a0ee8e095cf.slice/crio-5765db38088e11bdfe7983da70746bc25a3dcf9d28e4e6bf67a13f3045d363f0 WatchSource:0}: Error finding container 5765db38088e11bdfe7983da70746bc25a3dcf9d28e4e6bf67a13f3045d363f0: Status 404 returned error can't find the container with id 5765db38088e11bdfe7983da70746bc25a3dcf9d28e4e6bf67a13f3045d363f0 Apr 20 23:12:35.695398 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.695372 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:07:33 +0000 UTC" deadline="2028-02-02 23:13:45.41955087 +0000 UTC" Apr 20 23:12:35.695398 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.695396 2575 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15672h1m9.724158106s" Apr 20 23:12:35.788101 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.788014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal" event={"ID":"0f860586cfb743e565374c69520bb765","Type":"ContainerStarted","Data":"ca36ecba44d9de269d2fac672cd4499182b22bcbb14749cf821a00e527b57769"} Apr 20 23:12:35.790429 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.790402 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" event={"ID":"b0d263e2-967d-4e71-96f0-73c4215ee6c2","Type":"ContainerStarted","Data":"23c179b48d46182193f7f414a90c53fefe94a149601ae36b9722e21d6e26db9f"} Apr 20 23:12:35.792201 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.792169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"c1cfd5887a04866cb99e0d9b8392e4a3f91ee777ab53d4466ac8297999d7ba56"} Apr 20 23:12:35.793734 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.793715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7nck" event={"ID":"7f44fca8-2437-4e45-8b93-eac1d3f54370","Type":"ContainerStarted","Data":"0d4fe1c4f754227f61c9570d90abd4387d44b2faae8755a9cc8dfcb4250ad03d"} Apr 20 23:12:35.795641 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.795623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l4jz9" event={"ID":"41e804c8-cacd-4b38-ba49-6a0ee8e095cf","Type":"ContainerStarted","Data":"5765db38088e11bdfe7983da70746bc25a3dcf9d28e4e6bf67a13f3045d363f0"} Apr 20 23:12:35.796587 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.796560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" 
event={"ID":"776a3be8-8dcf-4d7b-943b-adcc065e879d","Type":"ContainerStarted","Data":"d2d7e097257f5e8b5ee5547b640fe7c224cb4239c8aacab6454d4c767a3f1ab1"}
Apr 20 23:12:35.797779 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.797758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerStarted","Data":"fb5671fb6732170629eb5e978351da2eaf5cafd60543004c42c0295f50fd908f"}
Apr 20 23:12:35.798690 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.798673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n5czg" event={"ID":"de301afe-fe9a-4095-9efc-83afb3de4d54","Type":"ContainerStarted","Data":"d5ecd15d1d119025a071755b46dee21431c81a2612490116d1494332e3bc79b6"}
Apr 20 23:12:35.799562 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.799543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h9hnp" event={"ID":"f0d58ae6-6867-491f-888f-03272f7c80e7","Type":"ContainerStarted","Data":"934c8a621dab3c1af0acf80dab17582413c0bba6bbd075c3233e7c995801f351"}
Apr 20 23:12:35.800301 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.800257 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-251.ec2.internal" podStartSLOduration=1.800244315 podStartE2EDuration="1.800244315s" podCreationTimestamp="2026-04-20 23:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:12:35.800238827 +0000 UTC m=+3.601372525" watchObservedRunningTime="2026-04-20 23:12:35.800244315 +0000 UTC m=+3.601378008"
Apr 20 23:12:35.800503 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:35.800484 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d5rtn" 
event={"ID":"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef","Type":"ContainerStarted","Data":"c1284e597e809395535222bd43e00415d44a4fda2d056023342a9626e063cd5b"}
Apr 20 23:12:36.278997 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.278854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:36.279158 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.279016 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:36.279158 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.279087 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. No retries permitted until 2026-04-20 23:12:38.279067725 +0000 UTC m=+6.080201400 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:36.480163 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.480130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:36.480347 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.480329 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:36.480419 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.480357 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:36.480419 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.480370 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:36.480546 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.480425 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. 
No retries permitted until 2026-04-20 23:12:38.480404824 +0000 UTC m=+6.281538512 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:36.645137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.644797 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:12:36.779708 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.778239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:36.779708 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.778378 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:36.779708 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.778786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:36.779708 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:36.778886 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:36.819547 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.817574 2575 generic.go:358] "Generic (PLEG): container finished" podID="2caee1c055f23a368d6b80867e13d8e0" containerID="3cbebd8b511dcbe9b80265cfa75d4e8101f02b4f7f56bebb910ef4cee91f20da" exitCode=0
Apr 20 23:12:36.819547 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:36.818347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal" event={"ID":"2caee1c055f23a368d6b80867e13d8e0","Type":"ContainerDied","Data":"3cbebd8b511dcbe9b80265cfa75d4e8101f02b4f7f56bebb910ef4cee91f20da"}
Apr 20 23:12:37.831590 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:37.831551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal" event={"ID":"2caee1c055f23a368d6b80867e13d8e0","Type":"ContainerStarted","Data":"ed8eef41ec323ff99f5b7c887d0f0ee21e9e1aac628ea30992ef91f59468359d"}
Apr 20 23:12:37.847505 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:37.847436 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-251.ec2.internal" podStartSLOduration=3.847418027 podStartE2EDuration="3.847418027s" podCreationTimestamp="2026-04-20 23:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:12:37.846663999 +0000 UTC m=+5.647797697" watchObservedRunningTime="2026-04-20 23:12:37.847418027 +0000 UTC m=+5.648551724"
Apr 20 23:12:38.295010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:38.294929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:38.295164 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.295100 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:38.295164 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.295163 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. No retries permitted until 2026-04-20 23:12:42.295143583 +0000 UTC m=+10.096277273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:38.497065 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:38.497024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:38.506635 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.506057 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:38.506635 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.506095 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:38.506635 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.506112 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:38.506635 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.506187 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. No retries permitted until 2026-04-20 23:12:42.506165283 +0000 UTC m=+10.307298969 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:38.779738 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:38.779628 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:38.779885 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.779777 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:38.780145 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:38.779978 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:38.780145 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:38.780102 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:40.777937 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:40.777901 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:40.778362 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:40.777939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:40.778362 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:40.778043 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:40.778362 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:40.778159 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:42.329989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:42.329948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:42.330358 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.330119 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:42.330358 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.330183 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. No retries permitted until 2026-04-20 23:12:50.330162354 +0000 UTC m=+18.131296034 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:42.532147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:42.532111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:42.532338 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.532293 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:42.532338 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.532309 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:42.532338 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.532318 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:42.532508 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.532358 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. 
No retries permitted until 2026-04-20 23:12:50.532345427 +0000 UTC m=+18.333479102 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:42.780022 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:42.779936 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:42.782493 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.780984 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:42.783817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:42.783600 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:42.783817 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:42.783765 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:44.778943 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:44.778282 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:44.778943 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:44.778415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:44.778943 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:44.778808 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:44.778943 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:44.778898 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:46.778436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:46.778402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:46.778877 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:46.778438 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:46.778877 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:46.778546 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:46.778877 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:46.778705 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:48.778637 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:48.778597 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:48.779063 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:48.778597 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:48.779063 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:48.778746 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:48.779063 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:48.778853 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:50.392386 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:50.392353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:50.392921 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.392516 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:50.392921 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.392592 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. No retries permitted until 2026-04-20 23:13:06.392571631 +0000 UTC m=+34.193705327 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 23:12:50.593500 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:50.593441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:50.593671 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.593605 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 23:12:50.593671 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.593629 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 23:12:50.593671 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.593644 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:50.593852 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.593700 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. 
No retries permitted until 2026-04-20 23:13:06.593678888 +0000 UTC m=+34.394812565 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 23:12:50.777959 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:50.777890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:50.778076 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:50.777893 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:50.778076 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.778008 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:50.778155 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:50.778098 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:52.780600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.779455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:12:52.780600 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:52.779886 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:12:52.780600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.780251 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:12:52.780600 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:52.780331 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:12:52.858459 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.858321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-d5rtn" event={"ID":"8e0fc07a-cb69-4b29-bb33-219b34f8a7ef","Type":"ContainerStarted","Data":"8bdf879bbb113adcfa04c9f2ef10d3230038f7e85499fd0befbb5a3487e26961"}
Apr 20 23:12:52.859812 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.859682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" event={"ID":"b0d263e2-967d-4e71-96f0-73c4215ee6c2","Type":"ContainerStarted","Data":"17730b12c520e0b7b4633073bbe01d1124df5c6da86962e4b8377bd417891c1a"}
Apr 20 23:12:52.861808 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.861786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w7nck" event={"ID":"7f44fca8-2437-4e45-8b93-eac1d3f54370","Type":"ContainerStarted","Data":"ef9eaf504343bede1949f74a16341d7acb5dd17a63ad0fb9ae22003a276a17f4"}
Apr 20 23:12:52.863045 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.863025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" event={"ID":"776a3be8-8dcf-4d7b-943b-adcc065e879d","Type":"ContainerStarted","Data":"a16351fbdefa99673b9f627d7bb297352c96d8bcfc2975346398428611284230"}
Apr 20 23:12:52.864703 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.864680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h9hnp" event={"ID":"f0d58ae6-6867-491f-888f-03272f7c80e7","Type":"ContainerStarted","Data":"acf914590ff1582b0429829ddf89d5e0d6079399480b1ec0ab8aa25a556ee7a4"}
Apr 20 23:12:52.873872 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.873837 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-d5rtn" podStartSLOduration=2.842003536 podStartE2EDuration="19.87382746s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.524267022 +0000 UTC m=+3.325400696" lastFinishedPulling="2026-04-20 23:12:52.556090946 +0000 UTC m=+20.357224620" observedRunningTime="2026-04-20 23:12:52.873227928 +0000 UTC m=+20.674361623" watchObservedRunningTime="2026-04-20 23:12:52.87382746 +0000 UTC m=+20.674961156" Apr 20 23:12:52.888133 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.888092 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w7nck" podStartSLOduration=3.867676672 podStartE2EDuration="20.888082026s" podCreationTimestamp="2026-04-20 23:12:32 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.535681941 +0000 UTC m=+3.336815630" lastFinishedPulling="2026-04-20 23:12:52.556087309 +0000 UTC m=+20.357220984" observedRunningTime="2026-04-20 23:12:52.887669803 +0000 UTC m=+20.688803508" watchObservedRunningTime="2026-04-20 23:12:52.888082026 +0000 UTC m=+20.689215723" Apr 20 23:12:52.903410 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.903368 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h9hnp" podStartSLOduration=3.876864249 podStartE2EDuration="20.903352447s" podCreationTimestamp="2026-04-20 23:12:32 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.529541006 +0000 UTC m=+3.330674680" lastFinishedPulling="2026-04-20 23:12:52.556029201 +0000 UTC m=+20.357162878" observedRunningTime="2026-04-20 23:12:52.902797691 +0000 UTC m=+20.703931388" watchObservedRunningTime="2026-04-20 23:12:52.903352447 +0000 UTC m=+20.704486143" Apr 20 23:12:52.920942 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:52.920885 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cqf4v" podStartSLOduration=2.901135562 
podStartE2EDuration="19.920865896s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.537691526 +0000 UTC m=+3.338825201" lastFinishedPulling="2026-04-20 23:12:52.557421859 +0000 UTC m=+20.358555535" observedRunningTime="2026-04-20 23:12:52.92004769 +0000 UTC m=+20.721181389" watchObservedRunningTime="2026-04-20 23:12:52.920865896 +0000 UTC m=+20.721999665" Apr 20 23:12:53.869131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.869047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"908b8256d0efb4528182329f4f5c07e21f258402be439f2150517327fccb919a"} Apr 20 23:12:53.869131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.869089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"8b0a16657825e6980fcf2ab5f5ed37626f4d90a38e22b1775bdae86a8e3e7d05"} Apr 20 23:12:53.869131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.869101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"592d7a0b57293d67c83cabfefbf6d5ce663a823faf521993d3b511c231a71f1b"} Apr 20 23:12:53.869131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.869110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"c2791555a1922758b727c675a8615cfb95d84acd3d17dfc0408ac12a11fd1991"} Apr 20 23:12:53.869131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.869123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" 
event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"6269060ae3faf774b7f05b60d7b64c748caa10cbca57bca7092b6a955f434347"} Apr 20 23:12:53.869131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.869136 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"266af88b84a00e69332eab155d6297935cf0be33851089d46635353030ace94a"} Apr 20 23:12:53.870443 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.870416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l4jz9" event={"ID":"41e804c8-cacd-4b38-ba49-6a0ee8e095cf","Type":"ContainerStarted","Data":"a42618a06a407dd418c3a6bdd7b16cd5085bcc58d62f30bdbb675978aa7c3059"} Apr 20 23:12:53.871814 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.871783 2575 generic.go:358] "Generic (PLEG): container finished" podID="92cdfbdf-902b-416d-976d-04adddd35e2b" containerID="30390978f2b0cfcbce3f5a7c484edda8311a92c34778a6628468b7c34f5cab1e" exitCode=0 Apr 20 23:12:53.871943 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.871910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerDied","Data":"30390978f2b0cfcbce3f5a7c484edda8311a92c34778a6628468b7c34f5cab1e"} Apr 20 23:12:53.887419 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.887378 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-l4jz9" podStartSLOduration=4.615164041 podStartE2EDuration="21.887359568s" podCreationTimestamp="2026-04-20 23:12:32 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.538898918 +0000 UTC m=+3.340032592" lastFinishedPulling="2026-04-20 23:12:52.811094435 +0000 UTC m=+20.612228119" observedRunningTime="2026-04-20 23:12:53.885832804 +0000 UTC m=+21.686966499" 
watchObservedRunningTime="2026-04-20 23:12:53.887359568 +0000 UTC m=+21.688493263" Apr 20 23:12:53.970435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:53.970414 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 23:12:54.721971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.721864 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T23:12:53.970431914Z","UUID":"f7a91ae9-b469-45cc-90ef-43954ca97334","Handler":null,"Name":"","Endpoint":""} Apr 20 23:12:54.723615 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.723592 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 23:12:54.723739 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.723625 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 23:12:54.778618 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.778587 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:54.778771 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:54.778701 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:12:54.779184 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.779164 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:54.779296 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:54.779274 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:12:54.875431 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.875395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" event={"ID":"776a3be8-8dcf-4d7b-943b-adcc065e879d","Type":"ContainerStarted","Data":"fd2d70890c1bb1c66f98607a63d66640179afe0c6d6d3d3426ae6ec5780b43c0"} Apr 20 23:12:54.876756 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.876729 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n5czg" event={"ID":"de301afe-fe9a-4095-9efc-83afb3de4d54","Type":"ContainerStarted","Data":"9bca1a0d438bfd2042d31aeceb13ec2673b2ef33e9f202722e58fa847b52010a"} Apr 20 23:12:54.890703 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:54.890663 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n5czg" podStartSLOduration=4.861665053 podStartE2EDuration="21.890651219s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.527082967 +0000 UTC m=+3.328216641" lastFinishedPulling="2026-04-20 23:12:52.556069119 +0000 UTC m=+20.357202807" 
observedRunningTime="2026-04-20 23:12:54.890075496 +0000 UTC m=+22.691209194" watchObservedRunningTime="2026-04-20 23:12:54.890651219 +0000 UTC m=+22.691784914" Apr 20 23:12:55.881184 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:55.880975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" event={"ID":"776a3be8-8dcf-4d7b-943b-adcc065e879d","Type":"ContainerStarted","Data":"5770ec3f0eb6884d749200bdf173157bef0d7c11e94368d4a3980d34f31b698e"} Apr 20 23:12:55.884583 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:55.884552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"77ab150e5b28805c456ac4f8a5b427c3e4f8eb065bec105051fd5a761e980d69"} Apr 20 23:12:55.898944 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:55.898896 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8wzx7" podStartSLOduration=3.40549077 podStartE2EDuration="22.898882069s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.534705492 +0000 UTC m=+3.335839172" lastFinishedPulling="2026-04-20 23:12:55.028096795 +0000 UTC m=+22.829230471" observedRunningTime="2026-04-20 23:12:55.898822949 +0000 UTC m=+23.699956645" watchObservedRunningTime="2026-04-20 23:12:55.898882069 +0000 UTC m=+23.700015764" Apr 20 23:12:56.003579 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:56.003551 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:56.004248 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:56.004228 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:56.777962 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:56.777936 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:56.778095 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:56.778047 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:12:56.778388 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:56.778374 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:56.778460 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:56.778444 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:12:56.886282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:56.886254 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:56.886941 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:56.886922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-d5rtn" Apr 20 23:12:58.778791 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.778547 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:12:58.779345 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.778590 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:12:58.779345 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:58.778894 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:12:58.779345 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:12:58.778940 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:12:58.892518 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.892453 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" event={"ID":"172d49a4-6e7e-4772-9e06-73cad0eec748","Type":"ContainerStarted","Data":"2d2d1a3e956f2600c9eb0375a3bf6d17dad43b6c8d2638a9a6b2586eceec260a"} Apr 20 23:12:58.892848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.892828 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:58.892848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.892854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:58.894428 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.894406 2575 generic.go:358] "Generic (PLEG): container finished" podID="92cdfbdf-902b-416d-976d-04adddd35e2b" containerID="bcbd3b443984f9c67508c58e4e1b2d5bab39eec435f3fe1bb6f965fc284104b6" exitCode=0 Apr 20 23:12:58.894550 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.894495 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerDied","Data":"bcbd3b443984f9c67508c58e4e1b2d5bab39eec435f3fe1bb6f965fc284104b6"} Apr 20 23:12:58.908056 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.908036 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:58.918600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:58.918562 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" podStartSLOduration=8.647092076 podStartE2EDuration="25.918548927s" podCreationTimestamp="2026-04-20 23:12:33 
+0000 UTC" firstStartedPulling="2026-04-20 23:12:35.53907629 +0000 UTC m=+3.340209964" lastFinishedPulling="2026-04-20 23:12:52.810533134 +0000 UTC m=+20.611666815" observedRunningTime="2026-04-20 23:12:58.917006691 +0000 UTC m=+26.718140387" watchObservedRunningTime="2026-04-20 23:12:58.918548927 +0000 UTC m=+26.719682619" Apr 20 23:12:59.897962 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:59.897934 2575 generic.go:358] "Generic (PLEG): container finished" podID="92cdfbdf-902b-416d-976d-04adddd35e2b" containerID="bfa5b8360ecf12b5bf33eea0b54583d0147e96c431ed9d18c31648d82f939209" exitCode=0 Apr 20 23:12:59.898251 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:59.898009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerDied","Data":"bfa5b8360ecf12b5bf33eea0b54583d0147e96c431ed9d18c31648d82f939209"} Apr 20 23:12:59.899100 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:59.898604 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:12:59.913947 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:12:59.913925 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg" Apr 20 23:13:00.778540 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:00.778486 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:00.778540 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:00.778497 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:00.778671 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:00.778585 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:13:00.778754 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:00.778732 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:13:00.901206 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:00.901183 2575 generic.go:358] "Generic (PLEG): container finished" podID="92cdfbdf-902b-416d-976d-04adddd35e2b" containerID="634c08ec32fc6a46efdd758086bb74073d7f884368fe04ef48470ba97554db2a" exitCode=0 Apr 20 23:13:00.901509 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:00.901282 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerDied","Data":"634c08ec32fc6a46efdd758086bb74073d7f884368fe04ef48470ba97554db2a"} Apr 20 23:13:02.779385 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:02.779354 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:02.779834 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:02.779447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:02.779834 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:02.779453 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:13:02.779834 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:02.779566 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:13:04.778481 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:04.778441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:04.778883 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:04.778491 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:04.778883 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:04.778556 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:13:04.778883 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:04.778676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:13:06.413981 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:06.413945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:06.414557 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.414110 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:06.414557 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.414185 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs podName:0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd nodeName:}" failed. 
No retries permitted until 2026-04-20 23:13:38.414166541 +0000 UTC m=+66.215300217 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs") pod "network-metrics-daemon-qklww" (UID: "0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:06.616173 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:06.616138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:06.616376 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.616254 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:06.616376 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.616277 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:06.616376 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.616290 2575 projected.go:194] Error preparing data for projected volume kube-api-access-58k26 for pod openshift-network-diagnostics/network-check-target-b9mzc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:06.616376 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.616340 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26 
podName:45a7ee7b-b744-4b77-bc49-38abb3429332 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:38.61632643 +0000 UTC m=+66.417460107 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-58k26" (UniqueName: "kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26") pod "network-check-target-b9mzc" (UID: "45a7ee7b-b744-4b77-bc49-38abb3429332") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:06.778634 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:06.778559 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:06.778634 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:06.778560 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:06.778848 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.778712 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:13:06.778848 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:06.778787 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332" Apr 20 23:13:07.915989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:07.915813 2575 generic.go:358] "Generic (PLEG): container finished" podID="92cdfbdf-902b-416d-976d-04adddd35e2b" containerID="0b330aea84b86d4837e6193156395a2f7dd16811f44162ccd2375da339482100" exitCode=0 Apr 20 23:13:07.915989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:07.915907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerDied","Data":"0b330aea84b86d4837e6193156395a2f7dd16811f44162ccd2375da339482100"} Apr 20 23:13:08.778534 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:08.778511 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:08.778699 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:08.778518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:08.778699 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:08.778607 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd" Apr 20 23:13:08.778699 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:08.778672 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:13:08.920503 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:08.920454 2575 generic.go:358] "Generic (PLEG): container finished" podID="92cdfbdf-902b-416d-976d-04adddd35e2b" containerID="9d105fabe3c88321b4676ec6efbac70f9dc98dc1d043fc5eafd6a4e5bf87db8c" exitCode=0
Apr 20 23:13:08.920503 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:08.920494 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerDied","Data":"9d105fabe3c88321b4676ec6efbac70f9dc98dc1d043fc5eafd6a4e5bf87db8c"}
Apr 20 23:13:09.926775 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:09.926741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jncnx" event={"ID":"92cdfbdf-902b-416d-976d-04adddd35e2b","Type":"ContainerStarted","Data":"b177419039e928f557a5410b6c2b738da0366d26c2b453932d6846084ff26659"}
Apr 20 23:13:09.956192 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:09.956138 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jncnx" podStartSLOduration=5.041817191 podStartE2EDuration="36.956122213s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:12:35.529411045 +0000 UTC m=+3.330544720" lastFinishedPulling="2026-04-20 23:13:07.443716065 +0000 UTC m=+35.244849742" observedRunningTime="2026-04-20 23:13:09.955532954 +0000 UTC m=+37.756666651" watchObservedRunningTime="2026-04-20 23:13:09.956122213 +0000 UTC m=+37.757255908"
Apr 20 23:13:10.181888 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:10.181813 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qklww"]
Apr 20 23:13:10.182011 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:10.181959 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:13:10.182103 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:10.182080 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:13:10.184178 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:10.184150 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b9mzc"]
Apr 20 23:13:10.184279 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:10.184264 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:13:10.184374 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:10.184356 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:13:10.990205 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:10.990178 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xs2h6"]
Apr 20 23:13:10.993256 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:10.993236 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:10.993377 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:10.993302 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xs2h6" podUID="edb5064a-5a9e-425a-8ba4-f78d68e1f2a8"
Apr 20 23:13:11.004653 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.004435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xs2h6"]
Apr 20 23:13:11.151880 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.151816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-kubelet-config\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.151880 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.151856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.151880 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.151875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-dbus\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.252950 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.252928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.253058 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.252955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-dbus\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.253106 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.253054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-kubelet-config\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.253106 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.253061 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:11.253106 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.253103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-kubelet-config\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.253214 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.253138 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret podName:edb5064a-5a9e-425a-8ba4-f78d68e1f2a8 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:11.753124478 +0000 UTC m=+39.554258157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret") pod "global-pull-secret-syncer-xs2h6" (UID: "edb5064a-5a9e-425a-8ba4-f78d68e1f2a8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:11.253214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.253158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-dbus\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.757207 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.757179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.757344 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.757276 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:11.757344 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.757326 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret podName:edb5064a-5a9e-425a-8ba4-f78d68e1f2a8 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:12.757312043 +0000 UTC m=+40.558445716 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret") pod "global-pull-secret-syncer-xs2h6" (UID: "edb5064a-5a9e-425a-8ba4-f78d68e1f2a8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:11.778135 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.778110 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:13:11.778222 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.778110 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:13:11.778268 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.778214 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:13:11.778321 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.778291 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:13:11.930874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:11.930841 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:11.930985 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:11.930944 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xs2h6" podUID="edb5064a-5a9e-425a-8ba4-f78d68e1f2a8"
Apr 20 23:13:12.762808 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:12.762771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:12.763294 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:12.762858 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:12.763294 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:12.762912 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret podName:edb5064a-5a9e-425a-8ba4-f78d68e1f2a8 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:14.762895733 +0000 UTC m=+42.564029406 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret") pod "global-pull-secret-syncer-xs2h6" (UID: "edb5064a-5a9e-425a-8ba4-f78d68e1f2a8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:13.778150 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:13.778118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:13.778150 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:13.778151 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:13:13.778646 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:13.778224 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xs2h6" podUID="edb5064a-5a9e-425a-8ba4-f78d68e1f2a8"
Apr 20 23:13:13.778646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:13.778245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:13:13.778646 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:13.778314 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qklww" podUID="0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd"
Apr 20 23:13:13.778646 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:13.778369 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9mzc" podUID="45a7ee7b-b744-4b77-bc49-38abb3429332"
Apr 20 23:13:14.482303 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.482092 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-251.ec2.internal" event="NodeReady"
Apr 20 23:13:14.482412 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.482338 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 23:13:14.517443 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.517417 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69bf6599f5-62ddf"]
Apr 20 23:13:14.567809 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.567783 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8jxc9"]
Apr 20 23:13:14.567952 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.567932 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.570387 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.570362 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xgs9r\""
Apr 20 23:13:14.570387 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.570373 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 23:13:14.570907 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.570710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 23:13:14.571008 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.570776 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 23:13:14.576839 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.576823 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 23:13:14.604981 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.604957 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-blz49"]
Apr 20 23:13:14.605109 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.605094 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.607198 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.607180 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 23:13:14.607386 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.607373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 23:13:14.607456 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.607429 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2fmt6\""
Apr 20 23:13:14.641066 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.641048 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69bf6599f5-62ddf"]
Apr 20 23:13:14.641137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.641071 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8jxc9"]
Apr 20 23:13:14.641137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.641085 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-blz49"]
Apr 20 23:13:14.641137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.641130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.643835 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.643819 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4dh2j\""
Apr 20 23:13:14.643931 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.643854 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 23:13:14.643931 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.643823 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 23:13:14.643931 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.643916 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 23:13:14.679257 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc67f46b-0422-4208-bb95-3e6e2a33d87a-registry-certificates\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679338 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc67f46b-0422-4208-bb95-3e6e2a33d87a-trusted-ca\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679338 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679279 2575 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc67f46b-0422-4208-bb95-3e6e2a33d87a-installation-pull-secrets\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679338 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc67f46b-0422-4208-bb95-3e6e2a33d87a-ca-trust-extracted\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx2s9\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-kube-api-access-vx2s9\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-registry-tls\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-bound-sa-token\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.679546 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.679446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dc67f46b-0422-4208-bb95-3e6e2a33d87a-image-registry-private-configuration\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.779677 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc67f46b-0422-4208-bb95-3e6e2a33d87a-trusted-ca\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.779677 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9ss\" (UniqueName: \"kubernetes.io/projected/90d78c4f-0d8b-4012-9508-a6f166ed7d86-kube-api-access-rr9ss\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fphxk\" (UniqueName: \"kubernetes.io/projected/f78adbde-3788-4728-8a4c-fb1195350fbe-kube-api-access-fphxk\") pod \"ingress-canary-blz49\" (UID: \"f78adbde-3788-4728-8a4c-fb1195350fbe\") " pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc67f46b-0422-4208-bb95-3e6e2a33d87a-installation-pull-secrets\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc67f46b-0422-4208-bb95-3e6e2a33d87a-ca-trust-extracted\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vx2s9\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-kube-api-access-vx2s9\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-registry-tls\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90d78c4f-0d8b-4012-9508-a6f166ed7d86-config-volume\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d78c4f-0d8b-4012-9508-a6f166ed7d86-metrics-tls\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-bound-sa-token\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f78adbde-3788-4728-8a4c-fb1195350fbe-cert\") pod \"ingress-canary-blz49\" (UID: \"f78adbde-3788-4728-8a4c-fb1195350fbe\") " pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.779986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dc67f46b-0422-4208-bb95-3e6e2a33d87a-image-registry-private-configuration\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.780050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6"
Apr 20 23:13:14.780597 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.780077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/90d78c4f-0d8b-4012-9508-a6f166ed7d86-tmp-dir\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.780597 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.780111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc67f46b-0422-4208-bb95-3e6e2a33d87a-registry-certificates\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780597 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:14.780562 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:14.780597 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.780076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dc67f46b-0422-4208-bb95-3e6e2a33d87a-ca-trust-extracted\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780810 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:14.780626 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret podName:edb5064a-5a9e-425a-8ba4-f78d68e1f2a8 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:18.780608916 +0000 UTC m=+46.581742591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret") pod "global-pull-secret-syncer-xs2h6" (UID: "edb5064a-5a9e-425a-8ba4-f78d68e1f2a8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 23:13:14.780810 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.780683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dc67f46b-0422-4208-bb95-3e6e2a33d87a-trusted-ca\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.780927 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.780908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dc67f46b-0422-4208-bb95-3e6e2a33d87a-registry-certificates\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.784307 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.784287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-registry-tls\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.784400 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.784296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dc67f46b-0422-4208-bb95-3e6e2a33d87a-installation-pull-secrets\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.784439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.784420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dc67f46b-0422-4208-bb95-3e6e2a33d87a-image-registry-private-configuration\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.787824 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.787801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-bound-sa-token\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.787954 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.787939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx2s9\" (UniqueName: \"kubernetes.io/projected/dc67f46b-0422-4208-bb95-3e6e2a33d87a-kube-api-access-vx2s9\") pod \"image-registry-69bf6599f5-62ddf\" (UID: \"dc67f46b-0422-4208-bb95-3e6e2a33d87a\") " pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.878963 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.878939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:14.880629 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.880609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90d78c4f-0d8b-4012-9508-a6f166ed7d86-config-volume\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.881285 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.881267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d78c4f-0d8b-4012-9508-a6f166ed7d86-metrics-tls\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.881410 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.881397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f78adbde-3788-4728-8a4c-fb1195350fbe-cert\") pod \"ingress-canary-blz49\" (UID: \"f78adbde-3788-4728-8a4c-fb1195350fbe\") " pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.881568 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.881554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/90d78c4f-0d8b-4012-9508-a6f166ed7d86-tmp-dir\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.881660 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.881648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9ss\" (UniqueName: \"kubernetes.io/projected/90d78c4f-0d8b-4012-9508-a6f166ed7d86-kube-api-access-rr9ss\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.881748 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.881737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fphxk\" (UniqueName: \"kubernetes.io/projected/f78adbde-3788-4728-8a4c-fb1195350fbe-kube-api-access-fphxk\") pod \"ingress-canary-blz49\" (UID: \"f78adbde-3788-4728-8a4c-fb1195350fbe\") " pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.882059 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.881230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90d78c4f-0d8b-4012-9508-a6f166ed7d86-config-volume\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.882985 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.882961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/90d78c4f-0d8b-4012-9508-a6f166ed7d86-tmp-dir\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.884310 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.884287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d78c4f-0d8b-4012-9508-a6f166ed7d86-metrics-tls\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.885071 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.885050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f78adbde-3788-4728-8a4c-fb1195350fbe-cert\") pod \"ingress-canary-blz49\" (UID: \"f78adbde-3788-4728-8a4c-fb1195350fbe\") " pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.890209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.890190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9ss\" (UniqueName: \"kubernetes.io/projected/90d78c4f-0d8b-4012-9508-a6f166ed7d86-kube-api-access-rr9ss\") pod \"dns-default-8jxc9\" (UID: \"90d78c4f-0d8b-4012-9508-a6f166ed7d86\") " pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.890314 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.890299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fphxk\" (UniqueName: \"kubernetes.io/projected/f78adbde-3788-4728-8a4c-fb1195350fbe-kube-api-access-fphxk\") pod \"ingress-canary-blz49\" (UID: \"f78adbde-3788-4728-8a4c-fb1195350fbe\") " pod="openshift-ingress-canary/ingress-canary-blz49"
Apr 20 23:13:14.913172 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.913152 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:14.949058 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:14.949038 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blz49" Apr 20 23:13:15.094058 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.094022 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-blz49"] Apr 20 23:13:15.097235 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.097210 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8jxc9"] Apr 20 23:13:15.100255 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.100232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69bf6599f5-62ddf"] Apr 20 23:13:15.102990 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:15.102965 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78adbde_3788_4728_8a4c_fb1195350fbe.slice/crio-dc3788a598c4232da901e935ac1bdecd425359516201c45abf333c1d842833f3 WatchSource:0}: Error finding container dc3788a598c4232da901e935ac1bdecd425359516201c45abf333c1d842833f3: Status 404 returned error can't find the container with id dc3788a598c4232da901e935ac1bdecd425359516201c45abf333c1d842833f3 Apr 20 23:13:15.103615 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:15.103567 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d78c4f_0d8b_4012_9508_a6f166ed7d86.slice/crio-825fa47806a174d54a09f1c6a6c116bf5022d9f06a516978b77be8a15190fc90 WatchSource:0}: Error finding container 825fa47806a174d54a09f1c6a6c116bf5022d9f06a516978b77be8a15190fc90: Status 404 returned error can't find the container with id 825fa47806a174d54a09f1c6a6c116bf5022d9f06a516978b77be8a15190fc90 Apr 20 23:13:15.104606 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:15.104564 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc67f46b_0422_4208_bb95_3e6e2a33d87a.slice/crio-9d81a3d144e5c6d231a0325f20b5b1dc60503add49f694cd6f339fe052fc782f WatchSource:0}: Error finding container 9d81a3d144e5c6d231a0325f20b5b1dc60503add49f694cd6f339fe052fc782f: Status 404 returned error can't find the container with id 9d81a3d144e5c6d231a0325f20b5b1dc60503add49f694cd6f339fe052fc782f Apr 20 23:13:15.778485 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.778267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww" Apr 20 23:13:15.778675 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.778267 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xs2h6" Apr 20 23:13:15.778998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.778323 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:15.782408 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.782385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 23:13:15.782860 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.782488 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 23:13:15.782860 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.782609 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 23:13:15.783081 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.783032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sk2w6\"" Apr 20 23:13:15.783226 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:13:15.783207 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g6bnb\"" Apr 20 23:13:15.783226 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.783223 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 23:13:15.939749 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.939711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-blz49" event={"ID":"f78adbde-3788-4728-8a4c-fb1195350fbe","Type":"ContainerStarted","Data":"dc3788a598c4232da901e935ac1bdecd425359516201c45abf333c1d842833f3"} Apr 20 23:13:15.941573 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.941543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf" event={"ID":"dc67f46b-0422-4208-bb95-3e6e2a33d87a","Type":"ContainerStarted","Data":"8e3a535e1bed335194d6b9edf1b320d13f53c860a1f44a31cb38bb70b6b0f21a"} Apr 20 23:13:15.941707 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.941582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf" event={"ID":"dc67f46b-0422-4208-bb95-3e6e2a33d87a","Type":"ContainerStarted","Data":"9d81a3d144e5c6d231a0325f20b5b1dc60503add49f694cd6f339fe052fc782f"} Apr 20 23:13:15.941707 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.941664 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf" Apr 20 23:13:15.943209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:15.943178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jxc9" event={"ID":"90d78c4f-0d8b-4012-9508-a6f166ed7d86","Type":"ContainerStarted","Data":"825fa47806a174d54a09f1c6a6c116bf5022d9f06a516978b77be8a15190fc90"} Apr 20 23:13:15.961265 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:13:15.961214 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf" podStartSLOduration=7.9611975170000004 podStartE2EDuration="7.961197517s" podCreationTimestamp="2026-04-20 23:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:13:15.960252948 +0000 UTC m=+43.761386647" watchObservedRunningTime="2026-04-20 23:13:15.961197517 +0000 UTC m=+43.762331212" Apr 20 23:13:17.949851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:17.949122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-blz49" event={"ID":"f78adbde-3788-4728-8a4c-fb1195350fbe","Type":"ContainerStarted","Data":"ac3d8f0c4509c214449a8f2ebf2ed526de14fa71c7dd1a72d4d27ebd388c6186"} Apr 20 23:13:17.955113 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:17.955088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jxc9" event={"ID":"90d78c4f-0d8b-4012-9508-a6f166ed7d86","Type":"ContainerStarted","Data":"c572c0190de51b6ff69e204c943225e45406ec8795ccc50eb5d98fa7c0a9a869"} Apr 20 23:13:17.955113 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:17.955118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8jxc9" event={"ID":"90d78c4f-0d8b-4012-9508-a6f166ed7d86","Type":"ContainerStarted","Data":"49c11a9534d24e5cd585d361db257ea8da24af383790b45f5c178c705574465a"} Apr 20 23:13:17.955358 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:17.955339 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8jxc9" Apr 20 23:13:17.965580 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:17.965545 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-blz49" 
podStartSLOduration=1.571631274 podStartE2EDuration="3.965534261s" podCreationTimestamp="2026-04-20 23:13:14 +0000 UTC" firstStartedPulling="2026-04-20 23:13:15.104949636 +0000 UTC m=+42.906083312" lastFinishedPulling="2026-04-20 23:13:17.498852624 +0000 UTC m=+45.299986299" observedRunningTime="2026-04-20 23:13:17.965363351 +0000 UTC m=+45.766497046" watchObservedRunningTime="2026-04-20 23:13:17.965534261 +0000 UTC m=+45.766667956" Apr 20 23:13:17.985717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:17.985668 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8jxc9" podStartSLOduration=1.597120126 podStartE2EDuration="3.985655497s" podCreationTimestamp="2026-04-20 23:13:14 +0000 UTC" firstStartedPulling="2026-04-20 23:13:15.105744964 +0000 UTC m=+42.906878650" lastFinishedPulling="2026-04-20 23:13:17.494280341 +0000 UTC m=+45.295414021" observedRunningTime="2026-04-20 23:13:17.984945212 +0000 UTC m=+45.786078909" watchObservedRunningTime="2026-04-20 23:13:17.985655497 +0000 UTC m=+45.786789193" Apr 20 23:13:18.812110 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:18.812078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6" Apr 20 23:13:18.815480 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:18.815440 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/edb5064a-5a9e-425a-8ba4-f78d68e1f2a8-original-pull-secret\") pod \"global-pull-secret-syncer-xs2h6\" (UID: \"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8\") " pod="kube-system/global-pull-secret-syncer-xs2h6" Apr 20 23:13:19.113378 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.113297 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xs2h6" Apr 20 23:13:19.250277 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.250228 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xs2h6"] Apr 20 23:13:19.254391 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:19.254366 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb5064a_5a9e_425a_8ba4_f78d68e1f2a8.slice/crio-b84e116771d3346842e896b10c63c69d054d19735c5e7560dbd9ebfd59116b44 WatchSource:0}: Error finding container b84e116771d3346842e896b10c63c69d054d19735c5e7560dbd9ebfd59116b44: Status 404 returned error can't find the container with id b84e116771d3346842e896b10c63c69d054d19735c5e7560dbd9ebfd59116b44 Apr 20 23:13:19.631426 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.631399 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn"] Apr 20 23:13:19.651755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.651729 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn"] Apr 20 23:13:19.651865 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.651764 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" Apr 20 23:13:19.653931 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.653913 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 23:13:19.654041 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.654026 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-tt2gf\"" Apr 20 23:13:19.719368 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.719329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92035fd5-ad3c-409b-8454-202c7e10d36a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jjlrn\" (UID: \"92035fd5-ad3c-409b-8454-202c7e10d36a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" Apr 20 23:13:19.819953 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.819920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92035fd5-ad3c-409b-8454-202c7e10d36a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jjlrn\" (UID: \"92035fd5-ad3c-409b-8454-202c7e10d36a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" Apr 20 23:13:19.820113 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:19.820033 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 20 23:13:19.820113 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:19.820101 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/92035fd5-ad3c-409b-8454-202c7e10d36a-tls-certificates podName:92035fd5-ad3c-409b-8454-202c7e10d36a nodeName:}" failed. No retries permitted until 2026-04-20 23:13:20.320080797 +0000 UTC m=+48.121214474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/92035fd5-ad3c-409b-8454-202c7e10d36a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-jjlrn" (UID: "92035fd5-ad3c-409b-8454-202c7e10d36a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 20 23:13:19.959921 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.959844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xs2h6" event={"ID":"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8","Type":"ContainerStarted","Data":"b84e116771d3346842e896b10c63c69d054d19735c5e7560dbd9ebfd59116b44"} Apr 20 23:13:19.972052 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.972027 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-m5xhr"] Apr 20 23:13:19.993339 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.993310 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m5xhr"] Apr 20 23:13:19.993477 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.993448 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m5xhr" Apr 20 23:13:19.996361 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.996338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-ftnp6\"" Apr 20 23:13:19.996629 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.996607 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 23:13:19.996810 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:19.996610 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 23:13:20.020955 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.020929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpccm\" (UniqueName: \"kubernetes.io/projected/b412400c-7349-4590-ad92-575f2cb10591-kube-api-access-hpccm\") pod \"downloads-6bcc868b7-m5xhr\" (UID: \"b412400c-7349-4590-ad92-575f2cb10591\") " pod="openshift-console/downloads-6bcc868b7-m5xhr" Apr 20 23:13:20.121956 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.121929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpccm\" (UniqueName: \"kubernetes.io/projected/b412400c-7349-4590-ad92-575f2cb10591-kube-api-access-hpccm\") pod \"downloads-6bcc868b7-m5xhr\" (UID: \"b412400c-7349-4590-ad92-575f2cb10591\") " pod="openshift-console/downloads-6bcc868b7-m5xhr" Apr 20 23:13:20.137950 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.137922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpccm\" (UniqueName: \"kubernetes.io/projected/b412400c-7349-4590-ad92-575f2cb10591-kube-api-access-hpccm\") pod \"downloads-6bcc868b7-m5xhr\" (UID: \"b412400c-7349-4590-ad92-575f2cb10591\") " pod="openshift-console/downloads-6bcc868b7-m5xhr" Apr 20 
23:13:20.304070 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.303997 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m5xhr" Apr 20 23:13:20.323254 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.323225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92035fd5-ad3c-409b-8454-202c7e10d36a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jjlrn\" (UID: \"92035fd5-ad3c-409b-8454-202c7e10d36a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" Apr 20 23:13:20.326176 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.326153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/92035fd5-ad3c-409b-8454-202c7e10d36a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-jjlrn\" (UID: \"92035fd5-ad3c-409b-8454-202c7e10d36a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" Apr 20 23:13:20.433835 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.433806 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m5xhr"] Apr 20 23:13:20.437347 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:20.437311 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb412400c_7349_4590_ad92_575f2cb10591.slice/crio-91c32428d2cbe5dd41c61ad72471775ca1d8d43e2785249485d34be578286137 WatchSource:0}: Error finding container 91c32428d2cbe5dd41c61ad72471775ca1d8d43e2785249485d34be578286137: Status 404 returned error can't find the container with id 91c32428d2cbe5dd41c61ad72471775ca1d8d43e2785249485d34be578286137 Apr 20 23:13:20.560798 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.560777 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" Apr 20 23:13:20.697276 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.694511 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn"] Apr 20 23:13:20.699084 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:20.699053 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92035fd5_ad3c_409b_8454_202c7e10d36a.slice/crio-3b23a7d5c307289c7a55f79a9d64c2c82ad0aaa0ef31f3c5926c037270657aa7 WatchSource:0}: Error finding container 3b23a7d5c307289c7a55f79a9d64c2c82ad0aaa0ef31f3c5926c037270657aa7: Status 404 returned error can't find the container with id 3b23a7d5c307289c7a55f79a9d64c2c82ad0aaa0ef31f3c5926c037270657aa7 Apr 20 23:13:20.702998 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.702968 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8jxc9_90d78c4f-0d8b-4012-9508-a6f166ed7d86/dns/0.log" Apr 20 23:13:20.881832 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.881756 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8jxc9_90d78c4f-0d8b-4012-9508-a6f166ed7d86/kube-rbac-proxy/0.log" Apr 20 23:13:20.962436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.962396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m5xhr" event={"ID":"b412400c-7349-4590-ad92-575f2cb10591","Type":"ContainerStarted","Data":"91c32428d2cbe5dd41c61ad72471775ca1d8d43e2785249485d34be578286137"} Apr 20 23:13:20.963343 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:20.963321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" 
event={"ID":"92035fd5-ad3c-409b-8454-202c7e10d36a","Type":"ContainerStarted","Data":"3b23a7d5c307289c7a55f79a9d64c2c82ad0aaa0ef31f3c5926c037270657aa7"} Apr 20 23:13:21.682223 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:21.682193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w7nck_7f44fca8-2437-4e45-8b93-eac1d3f54370/dns-node-resolver/0.log" Apr 20 23:13:22.082140 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.082106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-69bf6599f5-62ddf_dc67f46b-0422-4208-bb95-3e6e2a33d87a/registry/0.log" Apr 20 23:13:22.321880 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.321837 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-7tmld"] Apr 20 23:13:22.335130 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.335054 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7tmld"] Apr 20 23:13:22.335319 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.335198 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.337794 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.337771 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 23:13:22.337905 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.337837 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-clbx8\"" Apr 20 23:13:22.337905 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.337872 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 23:13:22.338250 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.338228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 23:13:22.338866 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.338831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5b2202d-031f-4096-8113-72bf2ee199f4-crio-socket\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.338971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.338874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5b2202d-031f-4096-8113-72bf2ee199f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.338971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.338943 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5b2202d-031f-4096-8113-72bf2ee199f4-data-volume\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.339079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.338973 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5b2202d-031f-4096-8113-72bf2ee199f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.339079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.339019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88mn\" (UniqueName: \"kubernetes.io/projected/e5b2202d-031f-4096-8113-72bf2ee199f4-kube-api-access-b88mn\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.339265 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.339238 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 23:13:22.439869 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.439838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5b2202d-031f-4096-8113-72bf2ee199f4-data-volume\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.440015 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.439884 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5b2202d-031f-4096-8113-72bf2ee199f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.440015 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.439927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b88mn\" (UniqueName: \"kubernetes.io/projected/e5b2202d-031f-4096-8113-72bf2ee199f4-kube-api-access-b88mn\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.440015 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.439987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5b2202d-031f-4096-8113-72bf2ee199f4-crio-socket\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.440176 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.440016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5b2202d-031f-4096-8113-72bf2ee199f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld" Apr 20 23:13:22.440176 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:22.440113 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 23:13:22.440176 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:22.440173 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/e5b2202d-031f-4096-8113-72bf2ee199f4-insights-runtime-extractor-tls podName:e5b2202d-031f-4096-8113-72bf2ee199f4 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:22.940155612 +0000 UTC m=+50.741289289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e5b2202d-031f-4096-8113-72bf2ee199f4-insights-runtime-extractor-tls") pod "insights-runtime-extractor-7tmld" (UID: "e5b2202d-031f-4096-8113-72bf2ee199f4") : secret "insights-runtime-extractor-tls" not found
Apr 20 23:13:22.440318 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.440173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5b2202d-031f-4096-8113-72bf2ee199f4-data-volume\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:22.440318 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.440226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5b2202d-031f-4096-8113-72bf2ee199f4-crio-socket\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:22.440527 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.440497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5b2202d-031f-4096-8113-72bf2ee199f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:22.456016 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.455994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88mn\" (UniqueName: \"kubernetes.io/projected/e5b2202d-031f-4096-8113-72bf2ee199f4-kube-api-access-b88mn\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:22.681377 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.681307 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h9hnp_f0d58ae6-6867-491f-888f-03272f7c80e7/node-ca/0.log"
Apr 20 23:13:22.943724 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.943617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5b2202d-031f-4096-8113-72bf2ee199f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:22.947233 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.947055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5b2202d-031f-4096-8113-72bf2ee199f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-7tmld\" (UID: \"e5b2202d-031f-4096-8113-72bf2ee199f4\") " pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:22.947367 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:22.947346 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-7tmld"
Apr 20 23:13:23.481852 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:23.481822 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-blz49_f78adbde-3788-4728-8a4c-fb1195350fbe/serve-healthcheck-canary/0.log"
Apr 20 23:13:23.969534 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:23.969295 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-7tmld"]
Apr 20 23:13:23.971129 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:23.970993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" event={"ID":"92035fd5-ad3c-409b-8454-202c7e10d36a","Type":"ContainerStarted","Data":"b9250a34786b4a425ebb6a94d6aaee302300ea8b641c9b11bf1bd58654dabfc9"}
Apr 20 23:13:23.971629 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:23.971593 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn"
Apr 20 23:13:23.975749 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:23.975725 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5b2202d_031f_4096_8113_72bf2ee199f4.slice/crio-20cf6dd94929701e1886e0ee8919686bf86482a7f33d4b84c8c8853864627670 WatchSource:0}: Error finding container 20cf6dd94929701e1886e0ee8919686bf86482a7f33d4b84c8c8853864627670: Status 404 returned error can't find the container with id 20cf6dd94929701e1886e0ee8919686bf86482a7f33d4b84c8c8853864627670
Apr 20 23:13:23.976706 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:23.976588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn"
Apr 20 23:13:23.987043 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:23.987000 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-jjlrn" podStartSLOduration=1.8606561799999999 podStartE2EDuration="4.986948462s" podCreationTimestamp="2026-04-20 23:13:19 +0000 UTC" firstStartedPulling="2026-04-20 23:13:20.701067568 +0000 UTC m=+48.502201245" lastFinishedPulling="2026-04-20 23:13:23.82735984 +0000 UTC m=+51.628493527" observedRunningTime="2026-04-20 23:13:23.986011702 +0000 UTC m=+51.787145399" watchObservedRunningTime="2026-04-20 23:13:23.986948462 +0000 UTC m=+51.788082160"
Apr 20 23:13:24.719717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.719685 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"]
Apr 20 23:13:24.722799 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.722769 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.725679 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.725551 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 23:13:24.726576 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.726539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 23:13:24.726693 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.726585 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 23:13:24.726693 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.726539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nhgmf\""
Apr 20 23:13:24.726893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.726874 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 23:13:24.727080 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.727063 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 23:13:24.731685 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.731666 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"]
Apr 20 23:13:24.758245 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.758219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.758370 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.758260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78b55960-2f46-41ab-a33d-7e9dd549a98c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.758370 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.758288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6mb\" (UniqueName: \"kubernetes.io/projected/78b55960-2f46-41ab-a33d-7e9dd549a98c-kube-api-access-vg6mb\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.758514 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.758394 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.859574 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.859546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.859732 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.859608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.859732 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.859635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78b55960-2f46-41ab-a33d-7e9dd549a98c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.859732 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:24.859707 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 20 23:13:24.859862 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:24.859774 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-tls podName:78b55960-2f46-41ab-a33d-7e9dd549a98c nodeName:}" failed. No retries permitted until 2026-04-20 23:13:25.359753755 +0000 UTC m=+53.160887438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-rfkh4" (UID: "78b55960-2f46-41ab-a33d-7e9dd549a98c") : secret "prometheus-operator-tls" not found
Apr 20 23:13:24.859862 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.859809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6mb\" (UniqueName: \"kubernetes.io/projected/78b55960-2f46-41ab-a33d-7e9dd549a98c-kube-api-access-vg6mb\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.860378 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.860347 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78b55960-2f46-41ab-a33d-7e9dd549a98c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.863869 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.863845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.872620 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.872598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6mb\" (UniqueName: \"kubernetes.io/projected/78b55960-2f46-41ab-a33d-7e9dd549a98c-kube-api-access-vg6mb\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:24.977676 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.977594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7tmld" event={"ID":"e5b2202d-031f-4096-8113-72bf2ee199f4","Type":"ContainerStarted","Data":"598b9c59132e818c2ca5714ada55bf9ae8c49c7fd22a0d98a0caea96a129fb15"}
Apr 20 23:13:24.977676 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.977650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7tmld" event={"ID":"e5b2202d-031f-4096-8113-72bf2ee199f4","Type":"ContainerStarted","Data":"20cf6dd94929701e1886e0ee8919686bf86482a7f33d4b84c8c8853864627670"}
Apr 20 23:13:24.979284 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.979242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xs2h6" event={"ID":"edb5064a-5a9e-425a-8ba4-f78d68e1f2a8","Type":"ContainerStarted","Data":"6ff1312fbef35ac5e5a043cdd727f67298f32a8bf08c3516f760176dab123ec7"}
Apr 20 23:13:24.996745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:24.996707 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xs2h6" podStartSLOduration=10.408184944 podStartE2EDuration="14.99669553s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:19.256510645 +0000 UTC m=+47.057644319" lastFinishedPulling="2026-04-20 23:13:23.845021225 +0000 UTC m=+51.646154905" observedRunningTime="2026-04-20 23:13:24.996436043 +0000 UTC m=+52.797569739" watchObservedRunningTime="2026-04-20 23:13:24.99669553 +0000 UTC m=+52.797829220"
Apr 20 23:13:25.363396 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:25.363357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:25.366914 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:25.366887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b55960-2f46-41ab-a33d-7e9dd549a98c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rfkh4\" (UID: \"78b55960-2f46-41ab-a33d-7e9dd549a98c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:25.637159 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:25.637081 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"
Apr 20 23:13:25.785434 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:25.785402 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rfkh4"]
Apr 20 23:13:25.789807 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:25.789768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b55960_2f46_41ab_a33d_7e9dd549a98c.slice/crio-d3902558cc2cdaa5e374022adcba96b4d9c993e4b6e81da7a9905e8fbb749319 WatchSource:0}: Error finding container d3902558cc2cdaa5e374022adcba96b4d9c993e4b6e81da7a9905e8fbb749319: Status 404 returned error can't find the container with id d3902558cc2cdaa5e374022adcba96b4d9c993e4b6e81da7a9905e8fbb749319
Apr 20 23:13:25.984164 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:25.984079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7tmld" event={"ID":"e5b2202d-031f-4096-8113-72bf2ee199f4","Type":"ContainerStarted","Data":"df18ded87d47512ebc0b988d3dc29a23c5f8c6e72267f5f110a8b69d518adcca"}
Apr 20 23:13:25.986253 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:25.986221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4" event={"ID":"78b55960-2f46-41ab-a33d-7e9dd549a98c","Type":"ContainerStarted","Data":"d3902558cc2cdaa5e374022adcba96b4d9c993e4b6e81da7a9905e8fbb749319"}
Apr 20 23:13:26.991439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:26.991219 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-7tmld" event={"ID":"e5b2202d-031f-4096-8113-72bf2ee199f4","Type":"ContainerStarted","Data":"55498720016c3db106e833c7fe8a9852d367eefabce0bcc32a0450538e30b1a6"}
Apr 20 23:13:27.014411 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:27.014361 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-7tmld" podStartSLOduration=2.599692561 podStartE2EDuration="5.014345422s" podCreationTimestamp="2026-04-20 23:13:22 +0000 UTC" firstStartedPulling="2026-04-20 23:13:24.032738486 +0000 UTC m=+51.833872160" lastFinishedPulling="2026-04-20 23:13:26.447391342 +0000 UTC m=+54.248525021" observedRunningTime="2026-04-20 23:13:27.012439893 +0000 UTC m=+54.813573590" watchObservedRunningTime="2026-04-20 23:13:27.014345422 +0000 UTC m=+54.815479119"
Apr 20 23:13:27.959646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:27.959614 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8jxc9"
Apr 20 23:13:27.996829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:27.996788 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4" event={"ID":"78b55960-2f46-41ab-a33d-7e9dd549a98c","Type":"ContainerStarted","Data":"4823979dd868494fec0d503b25bce5bdcd626bb24666d59bc70b4144a1013078"}
Apr 20 23:13:27.996829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:27.996831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4" event={"ID":"78b55960-2f46-41ab-a33d-7e9dd549a98c","Type":"ContainerStarted","Data":"6cf910bd69f3fc6da1e2f4131acede092f0f75f8f93ca63dd95cfd69625b806d"}
Apr 20 23:13:28.015860 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:28.015817 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rfkh4" podStartSLOduration=2.445873496 podStartE2EDuration="4.015799372s" podCreationTimestamp="2026-04-20 23:13:24 +0000 UTC" firstStartedPulling="2026-04-20 23:13:25.792592618 +0000 UTC m=+53.593726296" lastFinishedPulling="2026-04-20 23:13:27.362518485 +0000 UTC m=+55.163652172" observedRunningTime="2026-04-20 23:13:28.014870623 +0000 UTC m=+55.816004319" watchObservedRunningTime="2026-04-20 23:13:28.015799372 +0000 UTC m=+55.816933069"
Apr 20 23:13:29.312119 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.312086 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-768777c665-tt2bp"]
Apr 20 23:13:29.348664 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.348630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-768777c665-tt2bp"]
Apr 20 23:13:29.348809 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.348751 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.351810 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.351780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 23:13:29.352873 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.352847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 23:13:29.352989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.352910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 23:13:29.352989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.352960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 23:13:29.353100 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.352910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7n6dv\""
Apr 20 23:13:29.353152 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.353127 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 23:13:29.392844 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.392817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npxbd\" (UniqueName: \"kubernetes.io/projected/722b5e53-8135-4017-925f-fc1051d33783-kube-api-access-npxbd\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.392964 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.392859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-oauth-serving-cert\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.392964 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.392876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-oauth-config\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.392964 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.392892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-console-config\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.393075 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.393029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-service-ca\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.393108 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.393088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-serving-cert\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-service-ca\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-serving-cert\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npxbd\" (UniqueName: \"kubernetes.io/projected/722b5e53-8135-4017-925f-fc1051d33783-kube-api-access-npxbd\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-oauth-serving-cert\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-oauth-config\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-console-config\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.494955 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.494931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-service-ca\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.495056 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.495026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-oauth-serving-cert\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.495116 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.495095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-console-config\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.497942 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.497923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-oauth-config\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.498198 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.498181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-serving-cert\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.508330 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.508311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npxbd\" (UniqueName: \"kubernetes.io/projected/722b5e53-8135-4017-925f-fc1051d33783-kube-api-access-npxbd\") pod \"console-768777c665-tt2bp\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.659967 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.659892 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:13:29.791626 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:29.791589 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-768777c665-tt2bp"]
Apr 20 23:13:29.795285 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:29.795255 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod722b5e53_8135_4017_925f_fc1051d33783.slice/crio-bde1b37c250ffbf1117a2307fc8b2a2444ccb51cbaf854035b5309b070efc1f8 WatchSource:0}: Error finding container bde1b37c250ffbf1117a2307fc8b2a2444ccb51cbaf854035b5309b070efc1f8: Status 404 returned error can't find the container with id bde1b37c250ffbf1117a2307fc8b2a2444ccb51cbaf854035b5309b070efc1f8
Apr 20 23:13:30.004182 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.004100 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-768777c665-tt2bp" event={"ID":"722b5e53-8135-4017-925f-fc1051d33783","Type":"ContainerStarted","Data":"bde1b37c250ffbf1117a2307fc8b2a2444ccb51cbaf854035b5309b070efc1f8"}
Apr 20 23:13:30.064480 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.064438 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"]
Apr 20 23:13:30.077786 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.077764 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8fn88"]
Apr 20 23:13:30.077948 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.077931 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"
Apr 20 23:13:30.080087 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.080035 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 23:13:30.080209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.080092 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-n99bl\""
Apr 20 23:13:30.080276 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.080220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 23:13:30.094383 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.094362 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vzql"]
Apr 20 23:13:30.094544 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.094529 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8fn88"
Apr 20 23:13:30.098097 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.098073 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 23:13:30.098097 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.098086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kdhph\""
Apr 20 23:13:30.100658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.098525 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 23:13:30.100658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.098552 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 23:13:30.100658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.099648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"
Apr 20 23:13:30.100658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.099701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxzsm\" (UniqueName: \"kubernetes.io/projected/77759cea-016e-42bd-9b8e-28f3fe6613aa-kube-api-access-bxzsm\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"
Apr 20 23:13:30.100658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.099759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"
Apr 20 23:13:30.100658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.099836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77759cea-016e-42bd-9b8e-28f3fe6613aa-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"
Apr 20 23:13:30.119772 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.119700 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"]
Apr 20 23:13:30.119877 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.119787 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vzql"]
Apr 20 23:13:30.119877 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.119831 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql"
Apr 20 23:13:30.122140 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.122120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 23:13:30.122237 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.122193 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 23:13:30.122237 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.122212 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 23:13:30.122237 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.122220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-d6gx2\""
Apr 20 23:13:30.201300 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxzsm\" (UniqueName: \"kubernetes.io/projected/77759cea-016e-42bd-9b8e-28f3fe6613aa-kube-api-access-bxzsm\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"
Apr 20 23:13:30.201441 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qd5\" (UniqueName: \"kubernetes.io/projected/23e9fc71-0f77-4c32-a564-7860aee3bd59-kube-api-access-g5qd5\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88"
Apr 20 23:13:30.201441 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201343 2575
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.201441 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-sys\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.201441 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-accelerators-collector-config\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.201662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.201662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" 
(UniqueName: \"kubernetes.io/configmap/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.201662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.201662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.201662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.201821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72kjx\" (UniqueName: 
\"kubernetes.io/projected/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-api-access-72kjx\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.201821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-textfile\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.201821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23e9fc71-0f77-4c32-a564-7860aee3bd59-metrics-client-ca\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.201821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77759cea-016e-42bd-9b8e-28f3fe6613aa-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.201821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-tls\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.202063 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-root\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.202063 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.202063 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.202063 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.201909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-wtmp\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.202063 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:30.202029 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 20 23:13:30.202292 ip-10-0-131-251 
kubenswrapper[2575]: E0420 23:13:30.202092 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-tls podName:77759cea-016e-42bd-9b8e-28f3fe6613aa nodeName:}" failed. No retries permitted until 2026-04-20 23:13:30.702071692 +0000 UTC m=+58.503205391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-79qxc" (UID: "77759cea-016e-42bd-9b8e-28f3fe6613aa") : secret "openshift-state-metrics-tls" not found Apr 20 23:13:30.202403 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.202383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77759cea-016e-42bd-9b8e-28f3fe6613aa-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.204110 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.204091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.214834 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.214789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxzsm\" (UniqueName: \"kubernetes.io/projected/77759cea-016e-42bd-9b8e-28f3fe6613aa-kube-api-access-bxzsm\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72kjx\" (UniqueName: \"kubernetes.io/projected/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-api-access-72kjx\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-textfile\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23e9fc71-0f77-4c32-a564-7860aee3bd59-metrics-client-ca\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-tls\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303444 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-root\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-wtmp\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303567 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g5qd5\" (UniqueName: \"kubernetes.io/projected/23e9fc71-0f77-4c32-a564-7860aee3bd59-kube-api-access-g5qd5\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-sys\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-accelerators-collector-config\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: 
\"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.303818 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-wtmp\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.304688 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.303908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-textfile\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.304688 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.304162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-root\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.304688 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.304176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.304688 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.304292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/23e9fc71-0f77-4c32-a564-7860aee3bd59-metrics-client-ca\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.304688 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.304387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/23e9fc71-0f77-4c32-a564-7860aee3bd59-sys\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.305061 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.304961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-accelerators-collector-config\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.305061 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.305038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.305502 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.305445 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.306639 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.306616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.307233 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.307205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.307860 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.307817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/23e9fc71-0f77-4c32-a564-7860aee3bd59-node-exporter-tls\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.308174 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.308147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-state-metrics-kube-rbac-proxy-config\") 
pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.314693 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.314563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qd5\" (UniqueName: \"kubernetes.io/projected/23e9fc71-0f77-4c32-a564-7860aee3bd59-kube-api-access-g5qd5\") pod \"node-exporter-8fn88\" (UID: \"23e9fc71-0f77-4c32-a564-7860aee3bd59\") " pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.325429 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.325387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72kjx\" (UniqueName: \"kubernetes.io/projected/b62cafb7-0ff7-4abf-ad06-094bdd2b3e31-kube-api-access-72kjx\") pod \"kube-state-metrics-69db897b98-7vzql\" (UID: \"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.407265 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.407225 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8fn88" Apr 20 23:13:30.430090 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.430064 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" Apr 20 23:13:30.706388 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.706342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.709345 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.709318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/77759cea-016e-42bd-9b8e-28f3fe6613aa-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79qxc\" (UID: \"77759cea-016e-42bd-9b8e-28f3fe6613aa\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:30.989091 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:30.989011 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" Apr 20 23:13:31.172113 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.172078 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 23:13:31.176521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.176494 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.179131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.178583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 23:13:31.179131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.178658 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 23:13:31.179131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.178800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 23:13:31.179131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.178888 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 23:13:31.179131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.178968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 23:13:31.179131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.179061 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 23:13:31.179647 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.179247 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 23:13:31.179647 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.179523 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 23:13:31.179647 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.179557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8hhs4\""
Apr 20 23:13:31.180229 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.180211 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 23:13:31.188723 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.188679 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:13:31.210807 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.210783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28w5h\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-kube-api-access-28w5h\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.210936 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.210825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.210936 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.210851 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.210936 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.210877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.210936 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.210905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.210961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-out\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211339 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-web-config\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211339 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211339 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-volume\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.211339 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.211266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311703 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-volume\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28w5h\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-kube-api-access-28w5h\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.311971 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.311930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.312107 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:31.311985 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle podName:ada96a23-00e6-4a4d-81d1-bf42436e01d8 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:31.811961537 +0000 UTC m=+59.613095230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8") : configmap references non-existent config key: ca-bundle.crt
Apr 20 23:13:31.312107 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.312039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-out\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.312107 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.312077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.312261 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.312124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.312261 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.312151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20
23:13:31.312261 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.312196 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-web-config\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.313427 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.312998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.314244 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.314198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.315016 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.314988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-volume\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.315366 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.315170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.315433 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.315376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.315805 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.315779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-web-config\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.316189 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.316165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.316637 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.316582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.317003 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.316962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.317384 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.317362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-out\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.317677 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.317657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.324623 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.324590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28w5h\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-kube-api-access-28w5h\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.816541 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.816509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.817346 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.817326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:31.914772 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:31.914745 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lkbxg"
Apr 20 23:13:32.090159 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:32.090080 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:13:33.181426 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.181391 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"]
Apr 20 23:13:33.185576 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.185555 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.194115 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194094 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 20 23:13:33.194224 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194132 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 20 23:13:33.194366 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 20 23:13:33.194502 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-gkntl\""
Apr 20 23:13:33.194585 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194559 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 20 23:13:33.194645 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7rj67alf0ckb9\""
Apr 20 23:13:33.194693 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.194668 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 20 23:13:33.199883 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.199865 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"]
Apr 20 23:13:33.228606 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228734 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228672 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kjpg\" (UniqueName: \"kubernetes.io/projected/a74fdd55-c38d-4b4b-ba43-d7351d05d186-kube-api-access-9kjpg\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228734 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228734 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-grpc-tls\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228885 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228885 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-tls\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228885 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a74fdd55-c38d-4b4b-ba43-d7351d05d186-metrics-client-ca\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.228978 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.228903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.329956 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.329925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kjpg\" (UniqueName: \"kubernetes.io/projected/a74fdd55-c38d-4b4b-ba43-d7351d05d186-kube-api-access-9kjpg\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.329971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.329999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-grpc-tls\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID:
\"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.330038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.330069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-tls\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.330095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a74fdd55-c38d-4b4b-ba43-d7351d05d186-metrics-client-ca\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330368 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.330351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.330996 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.330413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.331308 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.331280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a74fdd55-c38d-4b4b-ba43-d7351d05d186-metrics-client-ca\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.333245 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.333220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.333836 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.333593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-tls\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.333836 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.333606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-grpc-tls\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.333836 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.333694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.333836 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.333793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.334121 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.334092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/a74fdd55-c38d-4b4b-ba43-d7351d05d186-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID: \"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.337894 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.337876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kjpg\" (UniqueName: \"kubernetes.io/projected/a74fdd55-c38d-4b4b-ba43-d7351d05d186-kube-api-access-9kjpg\") pod \"thanos-querier-7dc9f489cd-29ctc\" (UID:
\"a74fdd55-c38d-4b4b-ba43-d7351d05d186\") " pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:33.497109 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:33.497026 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"
Apr 20 23:13:34.391695 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.391659 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6bc44454dd-hlmkm"]
Apr 20 23:13:34.396227 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.396206 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.399035 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.399013 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 23:13:34.399139 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.399073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 20 23:13:34.399827 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.399806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 20 23:13:34.399947 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.399808 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zfmp4\""
Apr 20 23:13:34.399947 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.399808 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 20 23:13:34.400058 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.399817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1efp2bd42t3oq\""
Apr 20 23:13:34.405277 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.405216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6bc44454dd-hlmkm"]
Apr 20 23:13:34.439613 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-metrics-server-audit-profiles\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.439745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.439745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-audit-log\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.439848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-secret-metrics-server-client-certs\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.439848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-secret-metrics-server-tls\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.439947 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mdvd\" (UniqueName: \"kubernetes.io/projected/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-kube-api-access-9mdvd\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.439986 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.439963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-client-ca-bundle\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.540980 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.540943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-client-ca-bundle\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.541123 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.540993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-metrics-server-audit-profiles\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.541123 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.541123 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-audit-log\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.541286 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-secret-metrics-server-client-certs\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm"
Apr 20 23:13:34.541286 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName:
\"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-secret-metrics-server-tls\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.541286 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mdvd\" (UniqueName: \"kubernetes.io/projected/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-kube-api-access-9mdvd\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.541689 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-audit-log\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.541932 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.541892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.542889 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.542854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-metrics-server-audit-profiles\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 
23:13:34.544291 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.544268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-secret-metrics-server-tls\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.544404 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.544368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-secret-metrics-server-client-certs\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.544537 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.544519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-client-ca-bundle\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.553198 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.553161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mdvd\" (UniqueName: \"kubernetes.io/projected/3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6-kube-api-access-9mdvd\") pod \"metrics-server-6bc44454dd-hlmkm\" (UID: \"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6\") " pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.709242 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.709162 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:34.850114 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.850072 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm"] Apr 20 23:13:34.853372 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.853348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:34.855900 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.855875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-ml2cv\"" Apr 20 23:13:34.856040 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.855962 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 23:13:34.862961 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.862938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm"] Apr 20 23:13:34.945587 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:34.945554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7b72f22d-cde7-4c91-aedc-a1ff5211db09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qrkkm\" (UID: \"7b72f22d-cde7-4c91-aedc-a1ff5211db09\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:35.046036 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:35.045994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7b72f22d-cde7-4c91-aedc-a1ff5211db09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qrkkm\" (UID: \"7b72f22d-cde7-4c91-aedc-a1ff5211db09\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:35.048689 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:35.048663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7b72f22d-cde7-4c91-aedc-a1ff5211db09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-qrkkm\" (UID: \"7b72f22d-cde7-4c91-aedc-a1ff5211db09\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:35.166702 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:35.166666 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:36.311421 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.311385 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 23:13:36.337951 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.337308 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 23:13:36.338175 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.338006 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.340561 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.340537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 23:13:36.341071 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.340844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 23:13:36.341071 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.340885 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 23:13:36.341071 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.340904 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wxjxh\"" Apr 20 23:13:36.341406 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.341383 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 23:13:36.341646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.341629 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 23:13:36.341716 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.341675 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-do8m9p5jjaptq\"" Apr 20 23:13:36.341767 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.341716 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 23:13:36.342013 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.341990 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 23:13:36.342261 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.342028 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 23:13:36.342261 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.342116 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 23:13:36.342261 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.342192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 23:13:36.343547 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.343525 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 23:13:36.347000 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.344825 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 23:13:36.359228 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359351 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359257 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-config\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359351 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359351 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359481 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bg5\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-kube-api-access-z4bg5\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359683 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-config-out\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359683 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359683 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359683 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359868 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:13:36.359683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-web-config\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359836 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.359868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.360138 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.359890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461212 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-config\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461212 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bg5\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-kube-api-access-z4bg5\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461683 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461663 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461737 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461716 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-config-out\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461792 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461792 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461792 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461938 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461938 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:13:36.461849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-web-config\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461938 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461885 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.461938 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.462137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.461978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:36.462137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.462004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
23:13:36.462137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.462033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.462137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.462076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.462385 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.462356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.462539 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.462514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.462539 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.462525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.466137 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.465827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.466238 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.466136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.466497 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.466449 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.466763 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.465835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.468522 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.468452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.469195 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.469171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.469661 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.469639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.469838 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.469785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-config-out\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.469957 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.469937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.470602 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.470555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bg5\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-kube-api-access-z4bg5\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.471056 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.471035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-config\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.471056 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.471047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.471415 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.471393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.472192 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.472169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-web-config\") pod \"prometheus-k8s-0\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.651083 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.651005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:13:36.950793 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:36.950711 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-69bf6599f5-62ddf"
Apr 20 23:13:37.103132 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.103091 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85f7d6b6bc-hqrxv"]
Apr 20 23:13:37.142843 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.142812 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85f7d6b6bc-hqrxv"]
Apr 20 23:13:37.143030 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.142956 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.151689 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.151278 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 23:13:37.169928 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.169893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-serving-cert\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.170041 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.169959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-console-config\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.170041 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.169988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-oauth-serving-cert\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.170041 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.170030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prr25\" (UniqueName: \"kubernetes.io/projected/923b7cec-5778-4d64-8d78-dab978911499-kube-api-access-prr25\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.170151 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.170101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-service-ca\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.170199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.170183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-trusted-ca-bundle\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.170235 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.170214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-oauth-config\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.270988 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.270890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-trusted-ca-bundle\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.270988 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.270956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-oauth-config\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.271210 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.270999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-serving-cert\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.271210 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.271039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-console-config\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.271210 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.271062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-oauth-serving-cert\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.271210 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.271105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prr25\" (UniqueName: \"kubernetes.io/projected/923b7cec-5778-4d64-8d78-dab978911499-kube-api-access-prr25\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.271210 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.271150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-service-ca\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.271800 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.271762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-trusted-ca-bundle\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.272089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.272012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-service-ca\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.272089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.272041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-console-config\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.272206 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.272065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-oauth-serving-cert\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.273744 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.273724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-serving-cert\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.273874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.273855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-oauth-config\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.278896 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.278874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prr25\" (UniqueName: \"kubernetes.io/projected/923b7cec-5778-4d64-8d78-dab978911499-kube-api-access-prr25\") pod \"console-85f7d6b6bc-hqrxv\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.373619 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.373482 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e9fc71_0f77_4c32_a564_7860aee3bd59.slice/crio-c04eb429ec7a15bfe6f072b08e251869793631ad049ac35602a9326e7d493970 WatchSource:0}: Error finding container c04eb429ec7a15bfe6f072b08e251869793631ad049ac35602a9326e7d493970: Status 404 returned error can't find the container with id c04eb429ec7a15bfe6f072b08e251869793631ad049ac35602a9326e7d493970
Apr 20 23:13:37.455836 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.455295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f7d6b6bc-hqrxv"
Apr 20 23:13:37.611163 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.610826 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6bc44454dd-hlmkm"]
Apr 20 23:13:37.633890 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.633820 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc"]
Apr 20 23:13:37.644197 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.644153 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7dc9f489cd-29ctc"]
Apr 20 23:13:37.717851 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.717806 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f5b2ea0_50c2_47c7_a5e4_ab66e92b4be6.slice/crio-34b67e385f09229cbea3fc0500723beec7a2413bb7e6f6e434a9b0f278e047e1 WatchSource:0}: Error finding container 34b67e385f09229cbea3fc0500723beec7a2413bb7e6f6e434a9b0f278e047e1: Status 404 returned error can't find the container with id 34b67e385f09229cbea3fc0500723beec7a2413bb7e6f6e434a9b0f278e047e1
Apr 20 23:13:37.718341 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.718318 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77759cea_016e_42bd_9b8e_28f3fe6613aa.slice/crio-d6806d96d9c24345597530e0b0e9d7e5a7904ce3aafac6aeeb98aca7a3b778fb WatchSource:0}: Error finding container d6806d96d9c24345597530e0b0e9d7e5a7904ce3aafac6aeeb98aca7a3b778fb: Status 404 returned error can't find the container with id d6806d96d9c24345597530e0b0e9d7e5a7904ce3aafac6aeeb98aca7a3b778fb
Apr 20 23:13:37.721307 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.721284 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85f7d6b6bc-hqrxv"]
Apr 20 23:13:37.727818 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.727792 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74fdd55_c38d_4b4b_ba43_d7351d05d186.slice/crio-a948ee2d60029eeef59c66cf8604328422e20770a0a9be4e7947621db8cddece WatchSource:0}: Error finding container a948ee2d60029eeef59c66cf8604328422e20770a0a9be4e7947621db8cddece: Status 404 returned error can't find the container with id a948ee2d60029eeef59c66cf8604328422e20770a0a9be4e7947621db8cddece
Apr 20 23:13:37.728770 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.728747 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923b7cec_5778_4d64_8d78_dab978911499.slice/crio-89fc98dd01677a584b44a9ccbc60a29a28c4304ad43191fde617dabfbd2b3fa7 WatchSource:0}: Error finding container 89fc98dd01677a584b44a9ccbc60a29a28c4304ad43191fde617dabfbd2b3fa7: Status 404 returned error can't find the container with id 89fc98dd01677a584b44a9ccbc60a29a28c4304ad43191fde617dabfbd2b3fa7
Apr 20 23:13:37.864951 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.864904 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm"]
Apr 20 23:13:37.869669 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.869626 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b72f22d_cde7_4c91_aedc_a1ff5211db09.slice/crio-5f577de58352fe5d371c49168436cb1c075eb02bb6bd1946f24f7491d43483e1 WatchSource:0}: Error finding container 5f577de58352fe5d371c49168436cb1c075eb02bb6bd1946f24f7491d43483e1: Status 404 returned error can't find the container with id 5f577de58352fe5d371c49168436cb1c075eb02bb6bd1946f24f7491d43483e1
Apr 20 23:13:37.876135 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.875632 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 23:13:37.883254 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.883208 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6db23973_ceed_4ff3_9e3b_84706a628966.slice/crio-d6fd16fd5163e0cdf9d53b7bd29d96361906f790f969d5b86d8b2192113e4bfe WatchSource:0}: Error finding container d6fd16fd5163e0cdf9d53b7bd29d96361906f790f969d5b86d8b2192113e4bfe: Status 404 returned error can't find the container with id d6fd16fd5163e0cdf9d53b7bd29d96361906f790f969d5b86d8b2192113e4bfe
Apr 20 23:13:37.905153 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.905103 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7vzql"]
Apr 20 23:13:37.907099 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.907057 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb62cafb7_0ff7_4abf_ad06_094bdd2b3e31.slice/crio-1fe1fac47cf5a85128cebb3ade1a72e7a49edb4a8fcbe0e630da5e20300156ad WatchSource:0}: Error finding container 1fe1fac47cf5a85128cebb3ade1a72e7a49edb4a8fcbe0e630da5e20300156ad: Status 404 returned error can't find the container with id 1fe1fac47cf5a85128cebb3ade1a72e7a49edb4a8fcbe0e630da5e20300156ad
Apr 20 23:13:37.914332 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:37.914082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:13:37.919050 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:37.919021 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada96a23_00e6_4a4d_81d1_bf42436e01d8.slice/crio-7c92168c5f8801e19828140af2b1373477ac69f0d436fd7350652d3c33883eb0 WatchSource:0}: Error finding container 7c92168c5f8801e19828140af2b1373477ac69f0d436fd7350652d3c33883eb0: Status 404 returned error can't find the container with id 7c92168c5f8801e19828140af2b1373477ac69f0d436fd7350652d3c33883eb0
Apr 20 23:13:38.032324 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.032292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f7d6b6bc-hqrxv" event={"ID":"923b7cec-5778-4d64-8d78-dab978911499","Type":"ContainerStarted","Data":"89fc98dd01677a584b44a9ccbc60a29a28c4304ad43191fde617dabfbd2b3fa7"}
Apr 20 23:13:38.033872 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.033835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"a948ee2d60029eeef59c66cf8604328422e20770a0a9be4e7947621db8cddece"}
Apr 20 23:13:38.035255 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.035218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"d6fd16fd5163e0cdf9d53b7bd29d96361906f790f969d5b86d8b2192113e4bfe"}
Apr 20 23:13:38.036838 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.036792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" event={"ID":"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31","Type":"ContainerStarted","Data":"1fe1fac47cf5a85128cebb3ade1a72e7a49edb4a8fcbe0e630da5e20300156ad"}
Apr 20 23:13:38.038329 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.038305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" event={"ID":"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6","Type":"ContainerStarted","Data":"34b67e385f09229cbea3fc0500723beec7a2413bb7e6f6e434a9b0f278e047e1"}
Apr 20 23:13:38.040083 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.040059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" event={"ID":"77759cea-016e-42bd-9b8e-28f3fe6613aa","Type":"ContainerStarted","Data":"e91223948823febaa85132b5048562740091a4d3f208ba12370e8d28cf35ecc4"}
Apr 20 23:13:38.040184 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.040087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" event={"ID":"77759cea-016e-42bd-9b8e-28f3fe6613aa","Type":"ContainerStarted","Data":"6cf76ec4e28e08917dc783193d675e9a0cca1fb029d98bb4c8e77915e2e6343e"}
Apr 20 23:13:38.040184 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.040097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" event={"ID":"77759cea-016e-42bd-9b8e-28f3fe6613aa","Type":"ContainerStarted","Data":"d6806d96d9c24345597530e0b0e9d7e5a7904ce3aafac6aeeb98aca7a3b778fb"}
Apr 20 23:13:38.043014 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.042956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m5xhr" event={"ID":"b412400c-7349-4590-ad92-575f2cb10591","Type":"ContainerStarted","Data":"c5b84d547818c54d10dd56de6ab0845e8a02cff2bcfa83621bd0183dc2950122"}
Apr 20 23:13:38.043664 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.043613 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-m5xhr"
Apr 20 23:13:38.045523 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.045210 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-m5xhr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.10:8080/\": dial tcp 10.133.0.10:8080: connect: connection refused" start-of-body=
Apr 20 23:13:38.045523 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.045258 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-m5xhr" podUID="b412400c-7349-4590-ad92-575f2cb10591" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.10:8080/\": dial tcp 10.133.0.10:8080: connect: connection refused"
Apr 20 23:13:38.048859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.048834 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"7c92168c5f8801e19828140af2b1373477ac69f0d436fd7350652d3c33883eb0"}
Apr 20 23:13:38.048962 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.048869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" event={"ID":"7b72f22d-cde7-4c91-aedc-a1ff5211db09","Type":"ContainerStarted","Data":"5f577de58352fe5d371c49168436cb1c075eb02bb6bd1946f24f7491d43483e1"}
Apr 20 23:13:38.048962 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.048886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8fn88" event={"ID":"23e9fc71-0f77-4c32-a564-7860aee3bd59","Type":"ContainerStarted","Data":"c04eb429ec7a15bfe6f072b08e251869793631ad049ac35602a9326e7d493970"}
Apr 20 23:13:38.060088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.059975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-m5xhr" podStartSLOduration=1.6966607439999999 podStartE2EDuration="19.059959499s" podCreationTimestamp="2026-04-20 23:13:19 +0000 UTC" firstStartedPulling="2026-04-20 23:13:20.439885337 +0000 UTC m=+48.241019025" lastFinishedPulling="2026-04-20 23:13:37.803184102 +0000 UTC m=+65.604317780" observedRunningTime="2026-04-20 23:13:38.059589445 +0000 UTC m=+65.860723154" watchObservedRunningTime="2026-04-20 23:13:38.059959499 +0000 UTC m=+65.861093177"
Apr 20 23:13:38.492040 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.491963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:13:38.494687 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.494487 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 23:13:38.505544 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.505516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd-metrics-certs\") pod \"network-metrics-daemon-qklww\" (UID: \"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd\") " pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:13:38.609505 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.609242 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sk2w6\""
Apr 20 23:13:38.618107 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.617992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qklww"
Apr 20 23:13:38.696525 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.696053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:13:38.699421 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.699235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 23:13:38.711001 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.710972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 23:13:38.727785 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.727703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58k26\" (UniqueName: \"kubernetes.io/projected/45a7ee7b-b744-4b77-bc49-38abb3429332-kube-api-access-58k26\") pod \"network-check-target-b9mzc\" (UID: \"45a7ee7b-b744-4b77-bc49-38abb3429332\") " pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:13:38.802697 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.802615 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qklww"]
Apr 20 23:13:38.812707 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:38.812655 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0efe962b_3a09_4fd9_99ca_a2e1bb52f3dd.slice/crio-53b1fff1bdadf64fbab1cd61f9e1f772b0d964af5a41c9ee1dda249b7dfb7981 WatchSource:0}: Error finding container 53b1fff1bdadf64fbab1cd61f9e1f772b0d964af5a41c9ee1dda249b7dfb7981: Status 404 returned error can't find the container with id 53b1fff1bdadf64fbab1cd61f9e1f772b0d964af5a41c9ee1dda249b7dfb7981
Apr 20 23:13:38.924069 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.924035 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g6bnb\""
Apr 20 23:13:38.931366 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:38.930938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9mzc"
Apr 20 23:13:39.078874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:39.078824 2575 generic.go:358] "Generic (PLEG): container finished" podID="23e9fc71-0f77-4c32-a564-7860aee3bd59" containerID="c666924a817820bca2704ee13a82f7fb41ffd08e8f35f7e64a2abf4cec1907a3" exitCode=0
Apr 20 23:13:39.079111 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:39.079044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8fn88" event={"ID":"23e9fc71-0f77-4c32-a564-7860aee3bd59","Type":"ContainerDied","Data":"c666924a817820bca2704ee13a82f7fb41ffd08e8f35f7e64a2abf4cec1907a3"}
Apr 20 23:13:39.085573 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:39.085518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qklww" event={"ID":"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd","Type":"ContainerStarted","Data":"53b1fff1bdadf64fbab1cd61f9e1f772b0d964af5a41c9ee1dda249b7dfb7981"}
Apr 20 23:13:39.092057 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:39.092024 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-m5xhr"
Apr 20 23:13:39.185324 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:39.184657 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b9mzc"]
Apr 20 23:13:39.201742 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:13:39.201686 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a7ee7b_b744_4b77_bc49_38abb3429332.slice/crio-af761323d6d6686708ac3e0e5d1c3caa2e1e4a8d1cf9bbc1eb418de34751b4c6 WatchSource:0}: Error finding container af761323d6d6686708ac3e0e5d1c3caa2e1e4a8d1cf9bbc1eb418de34751b4c6: Status 404 returned error can't find the container with id af761323d6d6686708ac3e0e5d1c3caa2e1e4a8d1cf9bbc1eb418de34751b4c6
Apr 20 23:13:40.109640 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:40.109596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8fn88" event={"ID":"23e9fc71-0f77-4c32-a564-7860aee3bd59","Type":"ContainerStarted","Data":"ef0691bf4bea4738f87825695e24eb1bc6c6bb440fec95e4cfe8ef4d1cef890a"}
Apr 20 23:13:40.109640 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:40.109643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8fn88" event={"ID":"23e9fc71-0f77-4c32-a564-7860aee3bd59","Type":"ContainerStarted","Data":"4bd8292ab374dd7f3e67c09b09d079bf48ad2fa038fa5553c1a13fc508fba548"}
Apr 20 23:13:40.112684 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:40.112638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b9mzc" event={"ID":"45a7ee7b-b744-4b77-bc49-38abb3429332","Type":"ContainerStarted","Data":"af761323d6d6686708ac3e0e5d1c3caa2e1e4a8d1cf9bbc1eb418de34751b4c6"}
Apr 20 23:13:42.809905 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:42.809852 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8fn88" podStartSLOduration=11.573944835 podStartE2EDuration="12.809833333s" podCreationTimestamp="2026-04-20 23:13:30 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.37582173 +0000 UTC m=+65.176955419" lastFinishedPulling="2026-04-20 23:13:38.611710243 +0000 UTC m=+66.412843917" observedRunningTime="2026-04-20 23:13:40.133190795 +0000 UTC m=+67.934324506" watchObservedRunningTime="2026-04-20 23:13:42.809833333 +0000 UTC m=+70.610967028"
Apr 20 23:13:49.167563 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.166909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-768777c665-tt2bp" event={"ID":"722b5e53-8135-4017-925f-fc1051d33783","Type":"ContainerStarted","Data":"a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52"}
Apr 20 23:13:49.179457 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.177810 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f7d6b6bc-hqrxv" event={"ID":"923b7cec-5778-4d64-8d78-dab978911499","Type":"ContainerStarted","Data":"3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1"}
Apr 20 23:13:49.183319 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.183248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"1f33b10d937198d1b3ece03af5999f337858ec32e95829d0e8ee51d8b0312153"}
Apr 20 23:13:49.183319 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.183281 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"52e431608b60baee59adaf6c4971646168ed343ec1b0770c36daa790c070a318"}
Apr 20 23:13:49.185742 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.184892 2575 generic.go:358] "Generic (PLEG): container finished" podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" exitCode=0
Apr 20 23:13:49.185742 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.184974 2575 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} Apr 20 23:13:49.188851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.188090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" event={"ID":"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31","Type":"ContainerStarted","Data":"23a5c5e7067bf8ea4470911d8b9d416c29995511d31d7a526bee87b1e2a8abe7"} Apr 20 23:13:49.188851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.188118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" event={"ID":"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31","Type":"ContainerStarted","Data":"31fb9ddc8f619cc7268dc7d9600b932afd46ffc0afa3ff9c7bb89b6af713332c"} Apr 20 23:13:49.188851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.188131 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" event={"ID":"b62cafb7-0ff7-4abf-ad06-094bdd2b3e31","Type":"ContainerStarted","Data":"5a90461591daf1a05cda7a1e322e6a50bdc510be65199afe113560ebc8d1be26"} Apr 20 23:13:49.192453 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.192373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" event={"ID":"3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6","Type":"ContainerStarted","Data":"e8d9fd38c1a7a31d364af66e709b0eef29860c2fd8e380c17f6db6bafc88d1e0"} Apr 20 23:13:49.195504 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.194110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b9mzc" event={"ID":"45a7ee7b-b744-4b77-bc49-38abb3429332","Type":"ContainerStarted","Data":"3665a31f1e0de4c3a72b96883e6d9e2f6e369cd8300465c813aa5bb73a57aa0b"} Apr 20 23:13:49.195504 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.194536 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-768777c665-tt2bp" podStartSLOduration=1.343426978 podStartE2EDuration="20.194521766s" podCreationTimestamp="2026-04-20 23:13:29 +0000 UTC" firstStartedPulling="2026-04-20 23:13:29.798515397 +0000 UTC m=+57.599649074" lastFinishedPulling="2026-04-20 23:13:48.649610173 +0000 UTC m=+76.450743862" observedRunningTime="2026-04-20 23:13:49.189708616 +0000 UTC m=+76.990842310" watchObservedRunningTime="2026-04-20 23:13:49.194521766 +0000 UTC m=+76.995655463" Apr 20 23:13:49.201318 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.201223 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" event={"ID":"77759cea-016e-42bd-9b8e-28f3fe6613aa","Type":"ContainerStarted","Data":"d7aa2b464795bd3bc1da3037fdf5cfd1fba74a2ca316e38f9a953a369cb402b9"} Apr 20 23:13:49.204920 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.204167 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a" exitCode=0 Apr 20 23:13:49.204920 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.204225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a"} Apr 20 23:13:49.209067 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.208815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qklww" event={"ID":"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd","Type":"ContainerStarted","Data":"4a6716049f167c6d93b19e66f9df99966cb6bbfae8835ea6237bec145fbe0f3d"} Apr 20 23:13:49.211389 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:13:49.210861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:13:49.212215 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.212185 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" event={"ID":"7b72f22d-cde7-4c91-aedc-a1ff5211db09","Type":"ContainerStarted","Data":"0b2709892babb9eacef54f4fe49d728e4664b1805dc73c57818695bf33b317e8"} Apr 20 23:13:49.212544 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.212515 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:49.218739 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.218708 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" Apr 20 23:13:49.254162 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.253499 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7vzql" podStartSLOduration=8.521180752 podStartE2EDuration="19.253481691s" podCreationTimestamp="2026-04-20 23:13:30 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.909372647 +0000 UTC m=+65.710506338" lastFinishedPulling="2026-04-20 23:13:48.641673588 +0000 UTC m=+76.442807277" observedRunningTime="2026-04-20 23:13:49.25064374 +0000 UTC m=+77.051777433" watchObservedRunningTime="2026-04-20 23:13:49.253481691 +0000 UTC m=+77.054615379" Apr 20 23:13:49.269120 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.269081 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85f7d6b6bc-hqrxv" podStartSLOduration=1.381612152 podStartE2EDuration="12.269068022s" podCreationTimestamp="2026-04-20 23:13:37 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.754205114 +0000 UTC 
m=+65.555338791" lastFinishedPulling="2026-04-20 23:13:48.641660972 +0000 UTC m=+76.442794661" observedRunningTime="2026-04-20 23:13:49.266696869 +0000 UTC m=+77.067830566" watchObservedRunningTime="2026-04-20 23:13:49.269068022 +0000 UTC m=+77.070201718" Apr 20 23:13:49.286499 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.286316 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b9mzc" podStartSLOduration=66.858368762 podStartE2EDuration="1m16.286303985s" podCreationTimestamp="2026-04-20 23:12:33 +0000 UTC" firstStartedPulling="2026-04-20 23:13:39.215093208 +0000 UTC m=+67.016226882" lastFinishedPulling="2026-04-20 23:13:48.643028428 +0000 UTC m=+76.444162105" observedRunningTime="2026-04-20 23:13:49.285730582 +0000 UTC m=+77.086864282" watchObservedRunningTime="2026-04-20 23:13:49.286303985 +0000 UTC m=+77.087437680" Apr 20 23:13:49.327339 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.327277 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" podStartSLOduration=4.407566068 podStartE2EDuration="15.327258904s" podCreationTimestamp="2026-04-20 23:13:34 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.720542136 +0000 UTC m=+65.521675812" lastFinishedPulling="2026-04-20 23:13:48.640234961 +0000 UTC m=+76.441368648" observedRunningTime="2026-04-20 23:13:49.325498956 +0000 UTC m=+77.126632650" watchObservedRunningTime="2026-04-20 23:13:49.327258904 +0000 UTC m=+77.128392601" Apr 20 23:13:49.339764 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.339715 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-qrkkm" podStartSLOduration=4.574742346 podStartE2EDuration="15.339701044s" podCreationTimestamp="2026-04-20 23:13:34 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.874947515 +0000 UTC m=+65.676081195" 
lastFinishedPulling="2026-04-20 23:13:48.639906206 +0000 UTC m=+76.441039893" observedRunningTime="2026-04-20 23:13:49.339261242 +0000 UTC m=+77.140394939" watchObservedRunningTime="2026-04-20 23:13:49.339701044 +0000 UTC m=+77.140834740" Apr 20 23:13:49.358376 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.358319 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79qxc" podStartSLOduration=8.746305803 podStartE2EDuration="19.358297451s" podCreationTimestamp="2026-04-20 23:13:30 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.95230579 +0000 UTC m=+65.753439471" lastFinishedPulling="2026-04-20 23:13:48.564297429 +0000 UTC m=+76.365431119" observedRunningTime="2026-04-20 23:13:49.356804958 +0000 UTC m=+77.157938656" watchObservedRunningTime="2026-04-20 23:13:49.358297451 +0000 UTC m=+77.159431149" Apr 20 23:13:49.660764 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.660734 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-768777c665-tt2bp" Apr 20 23:13:49.660911 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.660772 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-768777c665-tt2bp" Apr 20 23:13:49.665564 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:49.665540 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-768777c665-tt2bp" Apr 20 23:13:50.221257 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:50.221222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qklww" event={"ID":"0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd","Type":"ContainerStarted","Data":"87fb7b725e70db3ad813828e61552c88b997e5ff22c500d6f9dd3ba7d5e9f25f"} Apr 20 23:13:50.227035 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:50.226950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"ea67233e3e516b85c0aa8a9a9506dddfa16cd395acdef79f01f750a5c8734f47"} Apr 20 23:13:50.234142 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:50.234118 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-768777c665-tt2bp" Apr 20 23:13:50.238180 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:50.238143 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qklww" podStartSLOduration=68.414240176 podStartE2EDuration="1m18.238131684s" podCreationTimestamp="2026-04-20 23:12:32 +0000 UTC" firstStartedPulling="2026-04-20 23:13:38.818651327 +0000 UTC m=+66.619785003" lastFinishedPulling="2026-04-20 23:13:48.642542829 +0000 UTC m=+76.443676511" observedRunningTime="2026-04-20 23:13:50.236747167 +0000 UTC m=+78.037880860" watchObservedRunningTime="2026-04-20 23:13:50.238131684 +0000 UTC m=+78.039265402" Apr 20 23:13:52.874848 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:13:52.874815 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podada96a23_00e6_4a4d_81d1_bf42436e01d8.slice/crio-ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00.scope\": RecentStats: unable to find data in memory cache]" Apr 20 23:13:53.247697 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.247669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"4321e86642e252790f3c9a13c99c5719e96c8fd5959c603875c56b9def36b65a"} Apr 20 23:13:53.247776 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.247710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"0a35043b4b8a3c6c803a6837278166a39d85c6f593de3584fe3ab5389437fbfc"} Apr 20 23:13:53.247776 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.247741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" event={"ID":"a74fdd55-c38d-4b4b-ba43-d7351d05d186","Type":"ContainerStarted","Data":"d3da88cb70f94e31bc65e7c3ac4bf3a8bb741973d2d46320ff962b744a14e7ac"} Apr 20 23:13:53.247889 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.247869 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" Apr 20 23:13:53.250375 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.250353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} Apr 20 23:13:53.250455 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.250378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} Apr 20 23:13:53.255717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.255630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0"} Apr 20 23:13:53.255717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.255660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c"} Apr 20 23:13:53.255717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.255672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00"} Apr 20 23:13:53.255717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.255685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c"} Apr 20 23:13:53.255717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.255696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f"} Apr 20 23:13:53.273240 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:53.273189 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" podStartSLOduration=5.673847526 podStartE2EDuration="20.273171979s" podCreationTimestamp="2026-04-20 23:13:33 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.75423602 +0000 UTC m=+65.555369720" lastFinishedPulling="2026-04-20 23:13:52.353560499 +0000 UTC m=+80.154694173" observedRunningTime="2026-04-20 23:13:53.271253683 +0000 UTC m=+81.072387379" watchObservedRunningTime="2026-04-20 23:13:53.273171979 +0000 UTC m=+81.074305678" Apr 20 23:13:54.261923 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.261892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerStarted","Data":"1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc"} Apr 20 23:13:54.264842 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.264801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} Apr 20 23:13:54.264842 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.264841 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} Apr 20 23:13:54.265012 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.264852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} Apr 20 23:13:54.265012 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.264862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerStarted","Data":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} Apr 20 23:13:54.270571 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.270554 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7dc9f489cd-29ctc" Apr 20 23:13:54.286655 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.286617 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=8.854720788 podStartE2EDuration="23.286605257s" 
podCreationTimestamp="2026-04-20 23:13:31 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.921679549 +0000 UTC m=+65.722813228" lastFinishedPulling="2026-04-20 23:13:52.35356402 +0000 UTC m=+80.154697697" observedRunningTime="2026-04-20 23:13:54.284938393 +0000 UTC m=+82.086072088" watchObservedRunningTime="2026-04-20 23:13:54.286605257 +0000 UTC m=+82.087738953" Apr 20 23:13:54.332944 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.332888 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.41149204 podStartE2EDuration="18.332871183s" podCreationTimestamp="2026-04-20 23:13:36 +0000 UTC" firstStartedPulling="2026-04-20 23:13:37.885196031 +0000 UTC m=+65.686329706" lastFinishedPulling="2026-04-20 23:13:52.806575156 +0000 UTC m=+80.607708849" observedRunningTime="2026-04-20 23:13:54.32982092 +0000 UTC m=+82.130954617" watchObservedRunningTime="2026-04-20 23:13:54.332871183 +0000 UTC m=+82.134004881" Apr 20 23:13:54.709558 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.709531 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:54.709558 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:54.709561 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:13:56.651396 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:56.651368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:13:57.456243 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:57.456206 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85f7d6b6bc-hqrxv" Apr 20 23:13:57.456243 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:57.456245 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-console/console-85f7d6b6bc-hqrxv" Apr 20 23:13:57.460723 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:57.460704 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85f7d6b6bc-hqrxv" Apr 20 23:13:58.282257 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:58.282230 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85f7d6b6bc-hqrxv" Apr 20 23:13:58.325280 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:13:58.325247 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-768777c665-tt2bp"] Apr 20 23:14:14.714729 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:14.714702 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:14:14.718738 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:14.718716 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6bc44454dd-hlmkm" Apr 20 23:14:21.232889 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:21.232854 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b9mzc" Apr 20 23:14:23.345314 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.345249 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-768777c665-tt2bp" podUID="722b5e53-8135-4017-925f-fc1051d33783" containerName="console" containerID="cri-o://a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52" gracePeriod=15 Apr 20 23:14:23.608633 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.608612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-768777c665-tt2bp_722b5e53-8135-4017-925f-fc1051d33783/console/0.log" Apr 20 23:14:23.608728 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:14:23.608680 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-768777c665-tt2bp" Apr 20 23:14:23.703611 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.703583 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-oauth-serving-cert\") pod \"722b5e53-8135-4017-925f-fc1051d33783\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " Apr 20 23:14:23.703755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.703640 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-oauth-config\") pod \"722b5e53-8135-4017-925f-fc1051d33783\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " Apr 20 23:14:23.703755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.703658 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-service-ca\") pod \"722b5e53-8135-4017-925f-fc1051d33783\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " Apr 20 23:14:23.703755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.703686 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-serving-cert\") pod \"722b5e53-8135-4017-925f-fc1051d33783\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " Apr 20 23:14:23.703755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.703709 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npxbd\" (UniqueName: \"kubernetes.io/projected/722b5e53-8135-4017-925f-fc1051d33783-kube-api-access-npxbd\") pod \"722b5e53-8135-4017-925f-fc1051d33783\" (UID: 
\"722b5e53-8135-4017-925f-fc1051d33783\") " Apr 20 23:14:23.703755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.703723 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-console-config\") pod \"722b5e53-8135-4017-925f-fc1051d33783\" (UID: \"722b5e53-8135-4017-925f-fc1051d33783\") " Apr 20 23:14:23.704059 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.704031 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "722b5e53-8135-4017-925f-fc1051d33783" (UID: "722b5e53-8135-4017-925f-fc1051d33783"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:23.704158 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.704049 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-service-ca" (OuterVolumeSpecName: "service-ca") pod "722b5e53-8135-4017-925f-fc1051d33783" (UID: "722b5e53-8135-4017-925f-fc1051d33783"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:23.704272 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.704243 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-console-config" (OuterVolumeSpecName: "console-config") pod "722b5e53-8135-4017-925f-fc1051d33783" (UID: "722b5e53-8135-4017-925f-fc1051d33783"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:23.706064 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.706043 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "722b5e53-8135-4017-925f-fc1051d33783" (UID: "722b5e53-8135-4017-925f-fc1051d33783"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:23.706144 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.706094 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722b5e53-8135-4017-925f-fc1051d33783-kube-api-access-npxbd" (OuterVolumeSpecName: "kube-api-access-npxbd") pod "722b5e53-8135-4017-925f-fc1051d33783" (UID: "722b5e53-8135-4017-925f-fc1051d33783"). InnerVolumeSpecName "kube-api-access-npxbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:14:23.706196 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.706175 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "722b5e53-8135-4017-925f-fc1051d33783" (UID: "722b5e53-8135-4017-925f-fc1051d33783"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:23.804743 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.804719 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-oauth-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:23.804743 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.804742 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-oauth-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:23.804888 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.804752 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-service-ca\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:23.804888 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.804762 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/722b5e53-8135-4017-925f-fc1051d33783-console-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:23.804888 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.804770 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-npxbd\" (UniqueName: \"kubernetes.io/projected/722b5e53-8135-4017-925f-fc1051d33783-kube-api-access-npxbd\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:23.804888 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:23.804779 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/722b5e53-8135-4017-925f-fc1051d33783-console-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:24.363645 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.363617 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-768777c665-tt2bp_722b5e53-8135-4017-925f-fc1051d33783/console/0.log"
Apr 20 23:14:24.364124 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.363658 2575 generic.go:358] "Generic (PLEG): container finished" podID="722b5e53-8135-4017-925f-fc1051d33783" containerID="a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52" exitCode=2
Apr 20 23:14:24.364124 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.363693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-768777c665-tt2bp" event={"ID":"722b5e53-8135-4017-925f-fc1051d33783","Type":"ContainerDied","Data":"a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52"}
Apr 20 23:14:24.364124 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.363735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-768777c665-tt2bp" event={"ID":"722b5e53-8135-4017-925f-fc1051d33783","Type":"ContainerDied","Data":"bde1b37c250ffbf1117a2307fc8b2a2444ccb51cbaf854035b5309b070efc1f8"}
Apr 20 23:14:24.364124 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.363741 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-768777c665-tt2bp"
Apr 20 23:14:24.364124 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.363750 2575 scope.go:117] "RemoveContainer" containerID="a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52"
Apr 20 23:14:24.375721 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.375705 2575 scope.go:117] "RemoveContainer" containerID="a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52"
Apr 20 23:14:24.375991 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:24.375972 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52\": container with ID starting with a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52 not found: ID does not exist" containerID="a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52"
Apr 20 23:14:24.376053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.376008 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52"} err="failed to get container status \"a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52\": rpc error: code = NotFound desc = could not find container \"a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52\": container with ID starting with a22c4de341b3891c7e49198ef3340843029e60b7b99bb226f83fcf027868cd52 not found: ID does not exist"
Apr 20 23:14:24.383448 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.383427 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-768777c665-tt2bp"]
Apr 20 23:14:24.387931 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.387902 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-768777c665-tt2bp"]
Apr 20 23:14:24.782782 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:24.782687 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722b5e53-8135-4017-925f-fc1051d33783" path="/var/lib/kubelet/pods/722b5e53-8135-4017-925f-fc1051d33783/volumes"
Apr 20 23:14:36.652235 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:36.652201 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:36.671954 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:36.671931 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:37.418759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:37.418732 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:50.365252 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365215 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:14:50.365745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365657 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="alertmanager" containerID="cri-o://a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f" gracePeriod=120
Apr 20 23:14:50.365745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365715 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-metric" containerID="cri-o://0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0" gracePeriod=120
Apr 20 23:14:50.365859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365741 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-web" containerID="cri-o://ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00" gracePeriod=120
Apr 20 23:14:50.365859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365764 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy" containerID="cri-o://cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c" gracePeriod=120
Apr 20 23:14:50.365859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365810 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="prom-label-proxy" containerID="cri-o://1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc" gracePeriod=120
Apr 20 23:14:50.365859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:50.365768 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="config-reloader" containerID="cri-o://b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c" gracePeriod=120
Apr 20 23:14:51.447546 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447519 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc" exitCode=0
Apr 20 23:14:51.447546 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447543 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0" exitCode=0
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447551 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c" exitCode=0
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447558 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c" exitCode=0
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447568 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f" exitCode=0
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447585 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc"}
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0"}
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c"}
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c"}
Apr 20 23:14:51.447845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.447651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f"}
Apr 20 23:14:51.621119 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.621095 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:14:51.730538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730435 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730501 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-volume\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-cluster-tls-config\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730538 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730540 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-web\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730564 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28w5h\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-kube-api-access-28w5h\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730616 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-tls-assets\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730792 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-main-db\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.730856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730837 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-web-config\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.731054 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730896 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-main-tls\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.731054 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.730987 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-out\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.731159 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.731065 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.731211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.731176 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:14:51.731266 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.731242 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-metrics-client-ca\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.731928 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.731892 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 23:14:51.732072 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.731974 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 23:14:51.732330 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.732308 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy\") pod \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\" (UID: \"ada96a23-00e6-4a4d-81d1-bf42436e01d8\") "
Apr 20 23:14:51.732745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.732725 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.732830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.732751 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ada96a23-00e6-4a4d-81d1-bf42436e01d8-metrics-client-ca\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.732830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.732764 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-alertmanager-main-db\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.733622 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.733593 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:14:51.733731 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.733681 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-volume" (OuterVolumeSpecName: "config-volume") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.733889 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.733842 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.734427 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.734397 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-out" (OuterVolumeSpecName: "config-out") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:14:51.734567 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.734519 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-kube-api-access-28w5h" (OuterVolumeSpecName: "kube-api-access-28w5h") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "kube-api-access-28w5h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:14:51.734673 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.734643 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.734919 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.734897 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.736167 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.736144 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.738897 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.738873 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.746989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.746969 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-web-config" (OuterVolumeSpecName: "web-config") pod "ada96a23-00e6-4a4d-81d1-bf42436e01d8" (UID: "ada96a23-00e6-4a4d-81d1-bf42436e01d8"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 23:14:51.833569 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833546 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-tls-assets\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833569 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833565 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-web-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833569 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833575 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-main-tls\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833583 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-out\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833594 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833604 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833613 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-config-volume\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833622 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-cluster-tls-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833630 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ada96a23-00e6-4a4d-81d1-bf42436e01d8-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:51.833715 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:51.833640 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28w5h\" (UniqueName: \"kubernetes.io/projected/ada96a23-00e6-4a4d-81d1-bf42436e01d8-kube-api-access-28w5h\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:14:52.453655 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.453623 2575 generic.go:358] "Generic (PLEG): container finished" podID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerID="ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00" exitCode=0
Apr 20 23:14:52.453968 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.453680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00"}
Apr 20 23:14:52.453968 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.453715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ada96a23-00e6-4a4d-81d1-bf42436e01d8","Type":"ContainerDied","Data":"7c92168c5f8801e19828140af2b1373477ac69f0d436fd7350652d3c33883eb0"}
Apr 20 23:14:52.453968 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.453736 2575 scope.go:117] "RemoveContainer" containerID="1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc"
Apr 20 23:14:52.453968 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.453749 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:14:52.463209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.463188 2575 scope.go:117] "RemoveContainer" containerID="0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0"
Apr 20 23:14:52.470018 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.469999 2575 scope.go:117] "RemoveContainer" containerID="cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c"
Apr 20 23:14:52.476443 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.476426 2575 scope.go:117] "RemoveContainer" containerID="ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00"
Apr 20 23:14:52.478666 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.478645 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:14:52.484169 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.484121 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:14:52.503297 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.503143 2575 scope.go:117] "RemoveContainer" containerID="b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.511907 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512478 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512510 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512535 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-web"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512545 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-web"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512557 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="alertmanager"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512567 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="alertmanager"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512579 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-metric"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512586 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-metric"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512602 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="prom-label-proxy"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512611 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="prom-label-proxy"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512625 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="config-reloader"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512634 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="config-reloader"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512645 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="init-config-reloader"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512653 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="init-config-reloader"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512670 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="722b5e53-8135-4017-925f-fc1051d33783" containerName="console"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512678 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="722b5e53-8135-4017-925f-fc1051d33783" containerName="console"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512782 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-metric"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512804 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="prom-label-proxy"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512816 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy-web"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512828 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="alertmanager"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512838 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="config-reloader"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512848 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" containerName="kube-rbac-proxy"
Apr 20 23:14:52.513495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.512859 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="722b5e53-8135-4017-925f-fc1051d33783" containerName="console"
Apr 20 23:14:52.518707 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.518687 2575 scope.go:117] "RemoveContainer" containerID="a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f"
Apr 20 23:14:52.518891 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.518872 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 23:14:52.521449 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.521426 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 23:14:52.521600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.521426 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 23:14:52.521600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.521520 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 23:14:52.521600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.521434 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 23:14:52.521600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.521434 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 23:14:52.521964 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.521936 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 23:14:52.522053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.522020 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 23:14:52.522364 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.522344 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 23:14:52.522455 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.522403 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8hhs4\""
Apr 20 23:14:52.527311 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.527284 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 23:14:52.527999 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.527952 2575 scope.go:117] "RemoveContainer" containerID="aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a"
Apr 20 23:14:52.530380 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.530361 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 23:14:52.537243 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.537227 2575 scope.go:117] "RemoveContainer" containerID="1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc"
Apr 20 23:14:52.537509 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.537487 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc\": container with ID starting with 1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc not found: ID does not exist" containerID="1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc" Apr 20 23:14:52.537587 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.537520 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc"} err="failed to get container status \"1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc\": rpc error: code = NotFound desc = could not find container \"1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc\": container with ID starting with 1ae1cc0752199fe8078a8f1ddd8c8ad7f5416311dae118a37a9664226811c4cc not found: ID does not exist" Apr 20 23:14:52.537587 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.537543 2575 scope.go:117] "RemoveContainer" containerID="0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0" Apr 20 23:14:52.537778 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.537759 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0\": container with ID starting with 0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0 not found: ID does not exist" containerID="0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0" Apr 20 23:14:52.537843 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.537783 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0"} err="failed to get container status \"0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0\": rpc error: code = NotFound desc = could not find container 
\"0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0\": container with ID starting with 0683daed489172395f85cd38a83f71029e35bd971e93713ceed9d49fa0e8acd0 not found: ID does not exist" Apr 20 23:14:52.537843 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.537799 2575 scope.go:117] "RemoveContainer" containerID="cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c" Apr 20 23:14:52.538038 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.538021 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c\": container with ID starting with cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c not found: ID does not exist" containerID="cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c" Apr 20 23:14:52.538100 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538043 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c"} err="failed to get container status \"cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c\": rpc error: code = NotFound desc = could not find container \"cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c\": container with ID starting with cd2fec4070ca1c199ba0e038d41e7d56813fffdb5dd492b6a3ffdfcbca80726c not found: ID does not exist" Apr 20 23:14:52.538100 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538058 2575 scope.go:117] "RemoveContainer" containerID="ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00" Apr 20 23:14:52.538275 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.538258 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00\": container with ID starting with 
ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00 not found: ID does not exist" containerID="ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00" Apr 20 23:14:52.538332 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538278 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00"} err="failed to get container status \"ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00\": rpc error: code = NotFound desc = could not find container \"ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00\": container with ID starting with ca94fb6edc4091fa8fa42d33cc71fbab72677c1e883a251d65ac2537910e0c00 not found: ID does not exist" Apr 20 23:14:52.538332 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538293 2575 scope.go:117] "RemoveContainer" containerID="b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c" Apr 20 23:14:52.538538 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.538516 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c\": container with ID starting with b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c not found: ID does not exist" containerID="b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c" Apr 20 23:14:52.538605 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538543 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c"} err="failed to get container status \"b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c\": rpc error: code = NotFound desc = could not find container \"b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c\": container with ID starting with 
b0f3e62fb311c03c2b8b0684b4682a38d7c42939a0bc50443a9706d3a6e27b3c not found: ID does not exist" Apr 20 23:14:52.538605 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538562 2575 scope.go:117] "RemoveContainer" containerID="a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f" Apr 20 23:14:52.538789 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.538770 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f\": container with ID starting with a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f not found: ID does not exist" containerID="a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f" Apr 20 23:14:52.538849 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538792 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f"} err="failed to get container status \"a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f\": rpc error: code = NotFound desc = could not find container \"a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f\": container with ID starting with a83314ca637ceffe01eda6afea4893a777e6ad1ba45ffc2023cf464d8f20893f not found: ID does not exist" Apr 20 23:14:52.538849 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.538808 2575 scope.go:117] "RemoveContainer" containerID="aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a" Apr 20 23:14:52.539025 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:52.539010 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a\": container with ID starting with aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a not found: ID does not exist" 
containerID="aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a" Apr 20 23:14:52.539085 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.539029 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a"} err="failed to get container status \"aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a\": rpc error: code = NotFound desc = could not find container \"aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a\": container with ID starting with aea86b6726d3e21bf218d84b44eaaa19bf085790a5d28dfbf9051a11473ce06a not found: ID does not exist" Apr 20 23:14:52.641111 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-config-out\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
20 23:14:52.641209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641209 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641347 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641347 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-config-volume\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641347 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641347 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b646\" (UniqueName: \"kubernetes.io/projected/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-kube-api-access-8b646\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641545 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641545 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641545 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.641545 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.641536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-web-config\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.741893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.741893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b646\" (UniqueName: \"kubernetes.io/projected/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-kube-api-access-8b646\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.741893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-web-config\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.741997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-config-out\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742079 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.742435 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-config-volume\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.743223 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.742649 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.743814 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.743542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.743976 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.743880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.745293 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.745156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.745831 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.745809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-config-volume\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.746325 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:14:52.746165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.746521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.746456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.746521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.746493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.746770 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.746747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.746870 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.746750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.747205 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.747175 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-web-config\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.747421 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.747398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-config-out\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.750809 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.750778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b646\" (UniqueName: \"kubernetes.io/projected/6aef86e5-4b10-45e5-aae2-d4b46c4879cc-kube-api-access-8b646\") pod \"alertmanager-main-0\" (UID: \"6aef86e5-4b10-45e5-aae2-d4b46c4879cc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.781955 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.781929 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada96a23-00e6-4a4d-81d1-bf42436e01d8" path="/var/lib/kubelet/pods/ada96a23-00e6-4a4d-81d1-bf42436e01d8/volumes" Apr 20 23:14:52.834579 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.834554 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 23:14:52.963132 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:52.963104 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 23:14:52.965844 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:14:52.965813 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aef86e5_4b10_45e5_aae2_d4b46c4879cc.slice/crio-e71cd91378d1a0ef1a6cb4d2d1f676c7a6ef602249f2bc243fdb0f534fc229f3 WatchSource:0}: Error finding container e71cd91378d1a0ef1a6cb4d2d1f676c7a6ef602249f2bc243fdb0f534fc229f3: Status 404 returned error can't find the container with id e71cd91378d1a0ef1a6cb4d2d1f676c7a6ef602249f2bc243fdb0f534fc229f3 Apr 20 23:14:53.458541 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:53.458508 2575 generic.go:358] "Generic (PLEG): container finished" podID="6aef86e5-4b10-45e5-aae2-d4b46c4879cc" containerID="1925eb5414daa12d6da3a0ad920086a41f9d5ee3fe03b4ef1e1f2e0237dc9cf9" exitCode=0 Apr 20 23:14:53.458874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:53.458560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerDied","Data":"1925eb5414daa12d6da3a0ad920086a41f9d5ee3fe03b4ef1e1f2e0237dc9cf9"} Apr 20 23:14:53.458874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:53.458598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"e71cd91378d1a0ef1a6cb4d2d1f676c7a6ef602249f2bc243fdb0f534fc229f3"} Apr 20 23:14:54.465851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.465814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"949ba685c5074c59148dc68eea0d73ea986493a614d0cf22ff661beab9c09380"} Apr 20 23:14:54.465851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.465852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"21eb1cd6d44b4f596b7bc24c6dffe7ffd10c42bc732b0b04def7d3423a887c40"} Apr 20 23:14:54.466324 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.465864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"1611aeadf95c995fe4021f9adf031d1fe1b9e898813bd80c22e79ec360809818"} Apr 20 23:14:54.466324 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.465876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"64d9a8b576fd4f562246941a22ddac619938ce0d54eccba13b5690ccac9ded14"} Apr 20 23:14:54.466324 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.465886 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"76db143809ab3e3f6ad059486546015748e2d9e5ef9beec9d123a7520bfb7e08"} Apr 20 23:14:54.466324 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.465897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6aef86e5-4b10-45e5-aae2-d4b46c4879cc","Type":"ContainerStarted","Data":"268d93edc18b0af7262ad6aaf5a5d97e833565e9f2eec5a98de42dadb0a76e46"} Apr 20 23:14:54.491802 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.491761 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.4917467110000002 podStartE2EDuration="2.491746711s" podCreationTimestamp="2026-04-20 23:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:14:54.490070883 +0000 UTC m=+142.291204590" watchObservedRunningTime="2026-04-20 23:14:54.491746711 +0000 UTC m=+142.292880406" Apr 20 23:14:54.699073 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699031 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 23:14:54.699910 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699873 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="prometheus" containerID="cri-o://5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" gracePeriod=600 Apr 20 23:14:54.700059 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699895 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="thanos-sidecar" containerID="cri-o://1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" gracePeriod=600 Apr 20 23:14:54.700155 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699910 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-thanos" containerID="cri-o://d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" gracePeriod=600 Apr 20 23:14:54.700242 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699923 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-web" containerID="cri-o://abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" gracePeriod=600 Apr 20 23:14:54.700242 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699895 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy" containerID="cri-o://f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" gracePeriod=600 Apr 20 23:14:54.700382 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.699954 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="config-reloader" containerID="cri-o://ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" gracePeriod=600 Apr 20 23:14:54.960560 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:54.960536 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:14:55.063653 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063627 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.063787 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063667 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.063787 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063686 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-serving-certs-ca-bundle\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.063787 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063716 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-tls-assets\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.063787 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063748 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-config-out\") pod 
\"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.063787 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063774 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-trusted-ca-bundle\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064007 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063797 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-grpc-tls\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064007 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.063836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-metrics-client-certs\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064122 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064074 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:55.064188 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064164 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bg5\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-kube-api-access-z4bg5\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064238 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064205 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-thanos-prometheus-http-client-file\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064238 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064212 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:55.064340 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064276 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-metrics-client-ca\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064340 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064304 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-db\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064340 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064334 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-kube-rbac-proxy\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064510 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064416 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-kubelet-serving-ca-bundle\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064606 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064582 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-tls\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: 
\"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064618 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-config\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064653 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-web-config\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.064828 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064678 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-rulefiles-0\") pod \"6db23973-ceed-4ff3-9e3b-84706a628966\" (UID: \"6db23973-ceed-4ff3-9e3b-84706a628966\") " Apr 20 23:14:55.066086 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.064981 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.066086 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.065002 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.066086 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.065346 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:55.066086 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.065766 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:55.066370 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.066158 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:14:55.066785 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.066695 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:14:55.067584 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.067526 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-kube-api-access-z4bg5" (OuterVolumeSpecName: "kube-api-access-z4bg5") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "kube-api-access-z4bg5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:14:55.068204 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.067964 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.068204 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.068034 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.068204 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.068104 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-config-out" (OuterVolumeSpecName: "config-out") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:14:55.068204 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.068161 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:14:55.068669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.068642 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-config" (OuterVolumeSpecName: "config") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.068782 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.068661 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.068845 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.068786 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.069259 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.069235 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.069259 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.069249 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.069417 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.069393 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.079723 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.079699 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-web-config" (OuterVolumeSpecName: "web-config") pod "6db23973-ceed-4ff3-9e3b-84706a628966" (UID: "6db23973-ceed-4ff3-9e3b-84706a628966"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:14:55.166082 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166061 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166082 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166081 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-tls-assets\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166090 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-config-out\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166098 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-grpc-tls\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166108 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-metrics-client-certs\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166116 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4bg5\" (UniqueName: \"kubernetes.io/projected/6db23973-ceed-4ff3-9e3b-84706a628966-kube-api-access-z4bg5\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 
20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166125 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166135 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-metrics-client-ca\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166143 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-db\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166151 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-kube-rbac-proxy\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166159 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166167 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-251.ec2.internal\" 
DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166177 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166185 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-web-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166199 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166193 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6db23973-ceed-4ff3-9e3b-84706a628966-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.166591 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.166202 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6db23973-ceed-4ff3-9e3b-84706a628966-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:14:55.471728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471700 2575 generic.go:358] "Generic (PLEG): container finished" podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" exitCode=0 Apr 20 23:14:55.471728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471722 2575 generic.go:358] "Generic (PLEG): container finished" podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" exitCode=0 Apr 20 23:14:55.471728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471728 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" exitCode=0 Apr 20 23:14:55.471728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471733 2575 generic.go:358] "Generic (PLEG): container finished" podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" exitCode=0 Apr 20 23:14:55.471728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471738 2575 generic.go:358] "Generic (PLEG): container finished" podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" exitCode=0 Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471742 2575 generic.go:358] "Generic (PLEG): container finished" podID="6db23973-ceed-4ff3-9e3b-84706a628966" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" exitCode=0 Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471828 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471843 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471958 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} Apr 20 23:14:55.472326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.471990 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6db23973-ceed-4ff3-9e3b-84706a628966","Type":"ContainerDied","Data":"d6fd16fd5163e0cdf9d53b7bd29d96361906f790f969d5b86d8b2192113e4bfe"} Apr 20 23:14:55.480282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.480156 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.487157 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.487135 2575 scope.go:117] "RemoveContainer" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.493320 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.493302 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.499515 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.499454 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 23:14:55.500403 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.500382 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.504063 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.504033 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 23:14:55.507666 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.507649 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.514238 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.514222 2575 scope.go:117] "RemoveContainer" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.520441 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.520425 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" 
Apr 20 23:14:55.520757 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:55.520727 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.520875 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.520764 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} err="failed to get container status \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" Apr 20 23:14:55.520875 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.520788 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.521144 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:55.521124 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.521220 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521157 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} err="failed to get container status \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" Apr 20 23:14:55.521220 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521172 2575 scope.go:117] "RemoveContainer" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.521427 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:55.521410 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.521516 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521437 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} err="failed to get container status \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist" Apr 20 23:14:55.521516 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521459 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.521698 ip-10-0-131-251 
kubenswrapper[2575]: E0420 23:14:55.521678 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.521736 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521706 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} err="failed to get container status \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": rpc error: code = NotFound desc = could not find container \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist" Apr 20 23:14:55.521736 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521723 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.521916 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:55.521902 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.521963 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521919 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} 
err="failed to get container status \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist" Apr 20 23:14:55.521963 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.521933 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.522110 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:55.522097 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.522146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522113 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} err="failed to get container status \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist" Apr 20 23:14:55.522146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522124 2575 scope.go:117] "RemoveContainer" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.522352 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:14:55.522336 2575 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.522393 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522356 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} err="failed to get container status \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": rpc error: code = NotFound desc = could not find container \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist" Apr 20 23:14:55.522393 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522368 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.522590 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522571 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} err="failed to get container status \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" Apr 20 23:14:55.522639 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522591 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.522770 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:14:55.522751 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} err="failed to get container status \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" Apr 20 23:14:55.522820 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522772 2575 scope.go:117] "RemoveContainer" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.522995 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522977 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} err="failed to get container status \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist" Apr 20 23:14:55.522995 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.522994 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.523217 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523201 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} err="failed to get container status \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": rpc error: code = NotFound desc = could not find container 
\"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist" Apr 20 23:14:55.523257 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523218 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.523420 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523405 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} err="failed to get container status \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist" Apr 20 23:14:55.523486 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523421 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.523612 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523595 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} err="failed to get container status \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist" Apr 20 23:14:55.523656 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523613 2575 scope.go:117] "RemoveContainer" 
containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.523780 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523761 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} err="failed to get container status \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": rpc error: code = NotFound desc = could not find container \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist" Apr 20 23:14:55.523821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523780 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.523949 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523932 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} err="failed to get container status \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" Apr 20 23:14:55.523990 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.523950 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.524141 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524123 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} err="failed to get container status 
\"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" Apr 20 23:14:55.524189 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524142 2575 scope.go:117] "RemoveContainer" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.524359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524341 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} err="failed to get container status \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist" Apr 20 23:14:55.524359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524358 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.524553 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524538 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} err="failed to get container status \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": rpc error: code = NotFound desc = could not find container \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist" Apr 20 23:14:55.524594 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:14:55.524554 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.524746 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524728 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} err="failed to get container status \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist" Apr 20 23:14:55.524783 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524747 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.524934 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524919 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} err="failed to get container status \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist" Apr 20 23:14:55.524973 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.524934 2575 scope.go:117] "RemoveContainer" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.525113 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525096 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} err="failed to get container status \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": rpc error: code = NotFound desc = could not find container \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist" Apr 20 23:14:55.525161 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525114 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.525260 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525246 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} err="failed to get container status \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" Apr 20 23:14:55.525294 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525261 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.525424 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525410 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} err="failed to get container status \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with 
f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" Apr 20 23:14:55.525424 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525424 2575 scope.go:117] "RemoveContainer" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.525612 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525597 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} err="failed to get container status \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist" Apr 20 23:14:55.525651 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525614 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.525823 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525807 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} err="failed to get container status \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": rpc error: code = NotFound desc = could not find container \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist" Apr 20 23:14:55.525877 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525823 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.525993 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525979 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} err="failed to get container status \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist" Apr 20 23:14:55.526035 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.525995 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.526218 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526200 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} err="failed to get container status \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist" Apr 20 23:14:55.526255 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526218 2575 scope.go:117] "RemoveContainer" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.526418 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526398 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} err="failed to get container status \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": rpc error: code = NotFound desc = could not find container 
\"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist" Apr 20 23:14:55.526478 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526419 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.526631 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526612 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} err="failed to get container status \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" Apr 20 23:14:55.526669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526634 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.526832 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526811 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} err="failed to get container status \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" Apr 20 23:14:55.526892 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.526832 2575 scope.go:117] "RemoveContainer" 
containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.527041 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527023 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} err="failed to get container status \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist" Apr 20 23:14:55.527088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527042 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350" Apr 20 23:14:55.527232 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527218 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} err="failed to get container status \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": rpc error: code = NotFound desc = could not find container \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist" Apr 20 23:14:55.527275 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527232 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752" Apr 20 23:14:55.527426 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527409 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} err="failed to get container status 
\"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist" Apr 20 23:14:55.527523 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527426 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926" Apr 20 23:14:55.527935 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527912 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} err="failed to get container status \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist" Apr 20 23:14:55.528027 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.527937 2575 scope.go:117] "RemoveContainer" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780" Apr 20 23:14:55.528258 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.528237 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} err="failed to get container status \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": rpc error: code = NotFound desc = could not find container \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist" Apr 20 23:14:55.528258 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:14:55.528258 2575 scope.go:117] "RemoveContainer" containerID="d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905" Apr 20 23:14:55.528602 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.528573 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905"} err="failed to get container status \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": rpc error: code = NotFound desc = could not find container \"d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905\": container with ID starting with d8f81bcda6ca0d87caff06fae41993bd09ca86359a343a516f7acdf43d4d6905 not found: ID does not exist" Apr 20 23:14:55.528602 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.528592 2575 scope.go:117] "RemoveContainer" containerID="f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b" Apr 20 23:14:55.528821 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.528803 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b"} err="failed to get container status \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": rpc error: code = NotFound desc = could not find container \"f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b\": container with ID starting with f1c3d658cfabd4d04a783559d9c4bbd9f6df320dae504caa6980b1e67d42146b not found: ID does not exist" Apr 20 23:14:55.528904 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.528822 2575 scope.go:117] "RemoveContainer" containerID="abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4" Apr 20 23:14:55.529045 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529020 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4"} err="failed to get container status \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": rpc error: code = NotFound desc = could not find container \"abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4\": container with ID starting with abdf7cac3bc6d2a23f36b44945a6fd1cada000efaa171680fcf10d2c3a61b6f4 not found: ID does not exist"
Apr 20 23:14:55.529045 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529036 2575 scope.go:117] "RemoveContainer" containerID="1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"
Apr 20 23:14:55.529243 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529227 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350"} err="failed to get container status \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": rpc error: code = NotFound desc = could not find container \"1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350\": container with ID starting with 1c3f86e46d6e6e6354cb9dde0c029a393e2c5d14e63cafd0b5cb0c3974657350 not found: ID does not exist"
Apr 20 23:14:55.529282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529243 2575 scope.go:117] "RemoveContainer" containerID="ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"
Apr 20 23:14:55.529439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529422 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752"} err="failed to get container status \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": rpc error: code = NotFound desc = could not find container \"ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752\": container with ID starting with ab64146783e006d55fb1a8e8fa14f0c77c0a789d05e1d89c35981cfe2e134752 not found: ID does not exist"
Apr 20 23:14:55.529439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529439 2575 scope.go:117] "RemoveContainer" containerID="5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"
Apr 20 23:14:55.529709 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529691 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926"} err="failed to get container status \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": rpc error: code = NotFound desc = could not find container \"5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926\": container with ID starting with 5f3317d2c158dd95dc1de15e31d01f9253f0f0f41deac85351cb02a609cf7926 not found: ID does not exist"
Apr 20 23:14:55.529755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529720 2575 scope.go:117] "RemoveContainer" containerID="fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"
Apr 20 23:14:55.529957 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.529936 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780"} err="failed to get container status \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": rpc error: code = NotFound desc = could not find container \"fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780\": container with ID starting with fc42dd2df4a2216e3e900fe6b4df2a29c307fd6df85352a187dac2cf239b1780 not found: ID does not exist"
Apr 20 23:14:55.530255 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530234 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 23:14:55.530552 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530540 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-thanos"
Apr 20 23:14:55.530600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530554 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-thanos"
Apr 20 23:14:55.530600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530571 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-web"
Apr 20 23:14:55.530600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530576 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-web"
Apr 20 23:14:55.530600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530585 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy"
Apr 20 23:14:55.530600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530590 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy"
Apr 20 23:14:55.530600 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530598 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="config-reloader"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530604 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="config-reloader"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530610 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="thanos-sidecar"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530615 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="thanos-sidecar"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530625 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="prometheus"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530629 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="prometheus"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530637 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="init-config-reloader"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530642 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="init-config-reloader"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530682 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-web"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530689 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="prometheus"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530696 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy-thanos"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530701 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="kube-rbac-proxy"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530707 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="thanos-sidecar"
Apr 20 23:14:55.530759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.530713 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" containerName="config-reloader"
Apr 20 23:14:55.537022 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.536994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.539513 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.539492 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 23:14:55.539644 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.539580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 23:14:55.540027 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.540003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-wxjxh\""
Apr 20 23:14:55.541139 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.540645 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 23:14:55.541689 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.541668 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 23:14:55.541816 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.541725 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 23:14:55.542246 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.542231 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 23:14:55.542765 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.542748 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 23:14:55.543553 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.543533 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 23:14:55.544002 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.543985 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 23:14:55.544315 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.544295 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-do8m9p5jjaptq\""
Apr 20 23:14:55.547247 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.547225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 23:14:55.549303 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.549276 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 23:14:55.550546 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.550520 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 23:14:55.551498 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.551448 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 23:14:55.671413 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671671 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-web-config\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671671 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671671 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671671 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671671 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671833 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671833 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671833 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/794c7306-b0dc-435f-9bd1-5359c1be1499-config-out\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671833 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671833 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/794c7306-b0dc-435f-9bd1-5359c1be1499-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.671833 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkgxg\" (UniqueName: \"kubernetes.io/projected/794c7306-b0dc-435f-9bd1-5359c1be1499-kube-api-access-mkgxg\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.672088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.672088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.672088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.671946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-config\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772289 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772289 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772289 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-web-config\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/794c7306-b0dc-435f-9bd1-5359c1be1499-config-out\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/794c7306-b0dc-435f-9bd1-5359c1be1499-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkgxg\" (UniqueName: \"kubernetes.io/projected/794c7306-b0dc-435f-9bd1-5359c1be1499-kube-api-access-mkgxg\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-config\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.772867 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.772802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.773341 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.773044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.773341 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.773077 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.773341 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.773233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.774157 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.774133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.775844 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.775515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/794c7306-b0dc-435f-9bd1-5359c1be1499-config-out\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.775844 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.775626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.775999 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.775974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-web-config\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.776589 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.776453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.777009 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.776939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.777116 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.777089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/794c7306-b0dc-435f-9bd1-5359c1be1499-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.777782 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.777455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.777886 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.777837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.778180 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.778159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-config\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.778313 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.778297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.778378 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.778361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/794c7306-b0dc-435f-9bd1-5359c1be1499-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.778975 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.778959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/794c7306-b0dc-435f-9bd1-5359c1be1499-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.782121 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.782098 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkgxg\" (UniqueName: \"kubernetes.io/projected/794c7306-b0dc-435f-9bd1-5359c1be1499-kube-api-access-mkgxg\") pod \"prometheus-k8s-0\" (UID: \"794c7306-b0dc-435f-9bd1-5359c1be1499\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.852286 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.852262 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 23:14:55.978580 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:55.978552 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 23:14:55.980971 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:14:55.980939 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794c7306_b0dc_435f_9bd1_5359c1be1499.slice/crio-96fc3b75635a7d0d809276b78ffe2b35ca0ec857832740451a29033c3fa5dc27 WatchSource:0}: Error finding container 96fc3b75635a7d0d809276b78ffe2b35ca0ec857832740451a29033c3fa5dc27: Status 404 returned error can't find the container with id 96fc3b75635a7d0d809276b78ffe2b35ca0ec857832740451a29033c3fa5dc27
Apr 20 23:14:56.478248 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:56.478214 2575 generic.go:358] "Generic (PLEG): container finished" podID="794c7306-b0dc-435f-9bd1-5359c1be1499" containerID="17799f552c4a219f660f4e48684cbeeb25c1ad76c7c23baf07ebf9eabcf43706" exitCode=0
Apr 20 23:14:56.478572 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:56.478290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerDied","Data":"17799f552c4a219f660f4e48684cbeeb25c1ad76c7c23baf07ebf9eabcf43706"}
Apr 20 23:14:56.478572 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:56.478312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"96fc3b75635a7d0d809276b78ffe2b35ca0ec857832740451a29033c3fa5dc27"}
Apr 20 23:14:56.783444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:56.783419 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db23973-ceed-4ff3-9e3b-84706a628966" path="/var/lib/kubelet/pods/6db23973-ceed-4ff3-9e3b-84706a628966/volumes"
Apr 20 23:14:57.488289 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.488256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"0ea62ac1723c71ebcc407c56425124f6d746c9b8eb5bca5ec6f3558edd13cf39"}
Apr 20 23:14:57.488289 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.488290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"b0ac65af9c30c3fcad424c728a5d1e6db9f3856ee31ab14453ae85222cabd392"}
Apr 20 23:14:57.488777 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.488299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"bda972bf2f00865cd1e7f20031a97d94a4cba60034cd94a940ce4584f250d20d"}
Apr 20 23:14:57.488777 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.488308 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"1ae402947ed1501d8f3d4410e007884dc313dbddd81eabb19d56d0b0743a1bd7"}
Apr 20 23:14:57.488777 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.488316 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"cb24b7d11d80be8a9fdeb130c06a60b256810168c98f09464c5835db1c37a7c3"}
Apr 20 23:14:57.488777 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.488323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"794c7306-b0dc-435f-9bd1-5359c1be1499","Type":"ContainerStarted","Data":"517e32c5b975a5433d48291c346dfde513ce9929c7c2fd790802eaaf740bf447"}
Apr 20 23:14:57.517341 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:57.517284 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.517266622 podStartE2EDuration="2.517266622s" podCreationTimestamp="2026-04-20 23:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:14:57.514788758 +0000 UTC m=+145.315922464" watchObservedRunningTime="2026-04-20 23:14:57.517266622 +0000 UTC m=+145.318400318"
Apr 20 23:14:59.594376 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.594343 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f6fc98c95-69mk5"]
Apr 20 23:14:59.598061 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.598032 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f6fc98c95-69mk5"
Apr 20 23:14:59.609644 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.609621 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f6fc98c95-69mk5"]
Apr 20 23:14:59.702429 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-oauth-config\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5"
Apr 20 23:14:59.702429 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-console-config\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5"
Apr 20 23:14:59.702636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-trusted-ca-bundle\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5"
Apr 20 23:14:59.702636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-serving-cert\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5"
Apr 20 
23:14:59.702636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-service-ca\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.702636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-oauth-serving-cert\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.702770 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.702652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlcr\" (UniqueName: \"kubernetes.io/projected/9635d186-390b-461b-9cd8-869eec113618-kube-api-access-vjlcr\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-oauth-config\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804088 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-console-config\") pod 
\"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-trusted-ca-bundle\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-serving-cert\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-service-ca\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-oauth-serving-cert\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.804288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.804271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlcr\" (UniqueName: 
\"kubernetes.io/projected/9635d186-390b-461b-9cd8-869eec113618-kube-api-access-vjlcr\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.806837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.805701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-service-ca\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.806837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.805738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-oauth-serving-cert\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.806837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.805849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-trusted-ca-bundle\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.806837 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.806047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-console-config\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.807592 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.807568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-serving-cert\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.807780 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.807757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-oauth-config\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.813344 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.813322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlcr\" (UniqueName: \"kubernetes.io/projected/9635d186-390b-461b-9cd8-869eec113618-kube-api-access-vjlcr\") pod \"console-6f6fc98c95-69mk5\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:14:59.909495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:14:59.909410 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:15:00.029118 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:00.029081 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f6fc98c95-69mk5"] Apr 20 23:15:00.031950 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:15:00.031925 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9635d186_390b_461b_9cd8_869eec113618.slice/crio-dffa7f0328b9002b134cb093ca6266731f8b9003142ba28cd139fbce494d6a5b WatchSource:0}: Error finding container dffa7f0328b9002b134cb093ca6266731f8b9003142ba28cd139fbce494d6a5b: Status 404 returned error can't find the container with id dffa7f0328b9002b134cb093ca6266731f8b9003142ba28cd139fbce494d6a5b Apr 20 23:15:00.501893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:00.501855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6fc98c95-69mk5" event={"ID":"9635d186-390b-461b-9cd8-869eec113618","Type":"ContainerStarted","Data":"d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2"} Apr 20 23:15:00.501893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:00.501898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6fc98c95-69mk5" event={"ID":"9635d186-390b-461b-9cd8-869eec113618","Type":"ContainerStarted","Data":"dffa7f0328b9002b134cb093ca6266731f8b9003142ba28cd139fbce494d6a5b"} Apr 20 23:15:00.519688 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:00.519638 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f6fc98c95-69mk5" podStartSLOduration=1.519626285 podStartE2EDuration="1.519626285s" podCreationTimestamp="2026-04-20 23:14:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:15:00.518099788 +0000 UTC 
m=+148.319233497" watchObservedRunningTime="2026-04-20 23:15:00.519626285 +0000 UTC m=+148.320759981" Apr 20 23:15:00.852920 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:00.852895 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:15:09.910359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:09.910319 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:15:09.910824 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:09.910403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:15:09.915558 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:09.915536 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:15:10.542292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:10.542267 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:15:10.594726 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:10.594699 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85f7d6b6bc-hqrxv"] Apr 20 23:15:35.623609 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:35.623561 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85f7d6b6bc-hqrxv" podUID="923b7cec-5778-4d64-8d78-dab978911499" containerName="console" containerID="cri-o://3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1" gracePeriod=15 Apr 20 23:15:35.868325 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:35.868300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f7d6b6bc-hqrxv_923b7cec-5778-4d64-8d78-dab978911499/console/0.log" Apr 20 23:15:35.868430 ip-10-0-131-251 kubenswrapper[2575]: 
I0420 23:15:35.868373 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85f7d6b6bc-hqrxv" Apr 20 23:15:36.007933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.007854 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-serving-cert\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.007933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.007891 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prr25\" (UniqueName: \"kubernetes.io/projected/923b7cec-5778-4d64-8d78-dab978911499-kube-api-access-prr25\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.007933 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.007924 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-oauth-config\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.008192 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.007951 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-service-ca\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.008192 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.007978 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-trusted-ca-bundle\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" 
(UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.008192 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.008051 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-oauth-serving-cert\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.008192 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.008093 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-console-config\") pod \"923b7cec-5778-4d64-8d78-dab978911499\" (UID: \"923b7cec-5778-4d64-8d78-dab978911499\") " Apr 20 23:15:36.008440 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.008389 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-service-ca" (OuterVolumeSpecName: "service-ca") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:15:36.008670 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.008450 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:15:36.008670 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.008456 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:15:36.008670 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.008654 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-console-config" (OuterVolumeSpecName: "console-config") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:15:36.010134 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.010113 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:15:36.010554 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.010533 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:15:36.010623 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.010552 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923b7cec-5778-4d64-8d78-dab978911499-kube-api-access-prr25" (OuterVolumeSpecName: "kube-api-access-prr25") pod "923b7cec-5778-4d64-8d78-dab978911499" (UID: "923b7cec-5778-4d64-8d78-dab978911499"). InnerVolumeSpecName "kube-api-access-prr25". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:15:36.109359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109339 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-oauth-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.109359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109359 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-console-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.109521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109369 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.109521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109381 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prr25\" (UniqueName: \"kubernetes.io/projected/923b7cec-5778-4d64-8d78-dab978911499-kube-api-access-prr25\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.109521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109389 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/923b7cec-5778-4d64-8d78-dab978911499-console-oauth-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.109521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109398 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-service-ca\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.109521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.109406 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/923b7cec-5778-4d64-8d78-dab978911499-trusted-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:15:36.624215 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.624191 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85f7d6b6bc-hqrxv_923b7cec-5778-4d64-8d78-dab978911499/console/0.log" Apr 20 23:15:36.624616 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.624232 2575 generic.go:358] "Generic (PLEG): container finished" podID="923b7cec-5778-4d64-8d78-dab978911499" containerID="3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1" exitCode=2 Apr 20 23:15:36.624616 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.624304 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85f7d6b6bc-hqrxv" Apr 20 23:15:36.624616 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.624327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f7d6b6bc-hqrxv" event={"ID":"923b7cec-5778-4d64-8d78-dab978911499","Type":"ContainerDied","Data":"3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1"} Apr 20 23:15:36.624616 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.624369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85f7d6b6bc-hqrxv" event={"ID":"923b7cec-5778-4d64-8d78-dab978911499","Type":"ContainerDied","Data":"89fc98dd01677a584b44a9ccbc60a29a28c4304ad43191fde617dabfbd2b3fa7"} Apr 20 23:15:36.624616 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.624385 2575 scope.go:117] "RemoveContainer" containerID="3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1" Apr 20 23:15:36.633091 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.633076 2575 scope.go:117] "RemoveContainer" containerID="3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1" Apr 20 23:15:36.633326 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:15:36.633309 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1\": container with ID starting with 3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1 not found: ID does not exist" containerID="3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1" Apr 20 23:15:36.633384 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.633340 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1"} err="failed to get container status \"3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1\": rpc error: code = 
NotFound desc = could not find container \"3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1\": container with ID starting with 3459d88f775ac645d27572628239369f9ed5faefca5a23ecc30aeae68b4958e1 not found: ID does not exist" Apr 20 23:15:36.646416 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.646388 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85f7d6b6bc-hqrxv"] Apr 20 23:15:36.649645 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.649628 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-85f7d6b6bc-hqrxv"] Apr 20 23:15:36.782564 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:36.782539 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="923b7cec-5778-4d64-8d78-dab978911499" path="/var/lib/kubelet/pods/923b7cec-5778-4d64-8d78-dab978911499/volumes" Apr 20 23:15:55.853243 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:55.853196 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:15:55.868960 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:55.868937 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:15:56.699900 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:15:56.699874 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 23:16:05.426649 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.426611 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-556c66679c-t75ds"] Apr 20 23:16:05.427030 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.426965 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="923b7cec-5778-4d64-8d78-dab978911499" containerName="console" Apr 20 23:16:05.427030 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.426975 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="923b7cec-5778-4d64-8d78-dab978911499" containerName="console" Apr 20 23:16:05.427030 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.427028 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="923b7cec-5778-4d64-8d78-dab978911499" containerName="console" Apr 20 23:16:05.431781 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.431751 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442096 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-serving-cert\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442225 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-config\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442225 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-oauth-serving-cert\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442225 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-trusted-ca-bundle\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442404 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-oauth-config\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442404 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mr6\" (UniqueName: \"kubernetes.io/projected/3ac8af9d-aaa7-4000-9535-2d4611bcad54-kube-api-access-g2mr6\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.442530 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.442404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-service-ca\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.443559 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.443539 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556c66679c-t75ds"] Apr 20 23:16:05.543211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-service-ca\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.543211 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-serving-cert\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.543520 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-config\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.543520 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-oauth-serving-cert\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.543520 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-trusted-ca-bundle\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.543520 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543484 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-oauth-config\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.543520 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mr6\" (UniqueName: \"kubernetes.io/projected/3ac8af9d-aaa7-4000-9535-2d4611bcad54-kube-api-access-g2mr6\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.544046 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.543961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-service-ca\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.544046 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.544004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-config\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.544046 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.544017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-oauth-serving-cert\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.544340 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:16:05.544269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-trusted-ca-bundle\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.545852 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.545835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-serving-cert\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.545947 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.545930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-oauth-config\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.552056 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.552032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mr6\" (UniqueName: \"kubernetes.io/projected/3ac8af9d-aaa7-4000-9535-2d4611bcad54-kube-api-access-g2mr6\") pod \"console-556c66679c-t75ds\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.742318 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.742244 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:05.882268 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:05.882229 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-556c66679c-t75ds"] Apr 20 23:16:05.887767 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:16:05.887739 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac8af9d_aaa7_4000_9535_2d4611bcad54.slice/crio-2d2dbacd775185d3b4a40beaf00eac341a7ce4296a73a7bd12ef8f6dfe399378 WatchSource:0}: Error finding container 2d2dbacd775185d3b4a40beaf00eac341a7ce4296a73a7bd12ef8f6dfe399378: Status 404 returned error can't find the container with id 2d2dbacd775185d3b4a40beaf00eac341a7ce4296a73a7bd12ef8f6dfe399378 Apr 20 23:16:06.716504 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:06.716455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556c66679c-t75ds" event={"ID":"3ac8af9d-aaa7-4000-9535-2d4611bcad54","Type":"ContainerStarted","Data":"5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb"} Apr 20 23:16:06.716908 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:06.716510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556c66679c-t75ds" event={"ID":"3ac8af9d-aaa7-4000-9535-2d4611bcad54","Type":"ContainerStarted","Data":"2d2dbacd775185d3b4a40beaf00eac341a7ce4296a73a7bd12ef8f6dfe399378"} Apr 20 23:16:06.734436 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:06.734386 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-556c66679c-t75ds" podStartSLOduration=1.73437265 podStartE2EDuration="1.73437265s" podCreationTimestamp="2026-04-20 23:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:16:06.733419103 +0000 UTC m=+214.534552799" 
watchObservedRunningTime="2026-04-20 23:16:06.73437265 +0000 UTC m=+214.535506345" Apr 20 23:16:15.742681 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:15.742650 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:15.742681 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:15.742687 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:15.747269 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:15.747246 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:16.750601 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:16.750565 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:16:16.799846 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:16.799820 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f6fc98c95-69mk5"] Apr 20 23:16:41.819131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:41.819063 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f6fc98c95-69mk5" podUID="9635d186-390b-461b-9cd8-869eec113618" containerName="console" containerID="cri-o://d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2" gracePeriod=15 Apr 20 23:16:42.058952 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.058920 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f6fc98c95-69mk5_9635d186-390b-461b-9cd8-869eec113618/console/0.log" Apr 20 23:16:42.059075 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.058994 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:16:42.146168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146099 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-console-config\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146138 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-service-ca\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146165 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-oauth-config\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146378 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146282 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-serving-cert\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146378 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146312 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-trusted-ca-bundle\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146378 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:16:42.146344 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjlcr\" (UniqueName: \"kubernetes.io/projected/9635d186-390b-461b-9cd8-869eec113618-kube-api-access-vjlcr\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146567 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146389 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-oauth-serving-cert\") pod \"9635d186-390b-461b-9cd8-869eec113618\" (UID: \"9635d186-390b-461b-9cd8-869eec113618\") " Apr 20 23:16:42.146662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146534 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-console-config" (OuterVolumeSpecName: "console-config") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:16:42.146733 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146659 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-service-ca" (OuterVolumeSpecName: "service-ca") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:16:42.146733 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146716 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-console-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.146849 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.146730 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:16:42.147128 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.147103 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:16:42.148485 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.148442 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:16:42.148559 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.148484 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:16:42.148559 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.148499 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9635d186-390b-461b-9cd8-869eec113618-kube-api-access-vjlcr" (OuterVolumeSpecName: "kube-api-access-vjlcr") pod "9635d186-390b-461b-9cd8-869eec113618" (UID: "9635d186-390b-461b-9cd8-869eec113618"). InnerVolumeSpecName "kube-api-access-vjlcr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:16:42.248004 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.247968 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.248004 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.247991 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-trusted-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.248004 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.248001 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vjlcr\" (UniqueName: \"kubernetes.io/projected/9635d186-390b-461b-9cd8-869eec113618-kube-api-access-vjlcr\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.248004 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.248011 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-oauth-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.248235 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.248020 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9635d186-390b-461b-9cd8-869eec113618-service-ca\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.248235 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.248029 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9635d186-390b-461b-9cd8-869eec113618-console-oauth-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:16:42.822815 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.822792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f6fc98c95-69mk5_9635d186-390b-461b-9cd8-869eec113618/console/0.log" Apr 20 23:16:42.823172 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.822830 2575 generic.go:358] "Generic (PLEG): container finished" podID="9635d186-390b-461b-9cd8-869eec113618" containerID="d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2" exitCode=2 Apr 20 23:16:42.823172 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.822863 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6fc98c95-69mk5" event={"ID":"9635d186-390b-461b-9cd8-869eec113618","Type":"ContainerDied","Data":"d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2"} Apr 20 23:16:42.823172 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.822893 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f6fc98c95-69mk5" Apr 20 23:16:42.823172 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.822911 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6fc98c95-69mk5" event={"ID":"9635d186-390b-461b-9cd8-869eec113618","Type":"ContainerDied","Data":"dffa7f0328b9002b134cb093ca6266731f8b9003142ba28cd139fbce494d6a5b"} Apr 20 23:16:42.823172 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.822936 2575 scope.go:117] "RemoveContainer" containerID="d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2" Apr 20 23:16:42.831056 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.831038 2575 scope.go:117] "RemoveContainer" containerID="d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2" Apr 20 23:16:42.831307 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:16:42.831290 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2\": container with ID starting with d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2 not found: ID does not exist" containerID="d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2" Apr 20 23:16:42.831357 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.831315 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2"} err="failed to get container status \"d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2\": rpc error: code = NotFound desc = could not find container \"d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2\": container with ID starting with d6f639e738026e98e4ed90d7a6e6f0bed9aa2c548b968a1eb0ae7310bfb9cbe2 not found: ID does not exist" Apr 20 23:16:42.841293 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.841265 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f6fc98c95-69mk5"] Apr 20 23:16:42.844565 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:42.844547 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f6fc98c95-69mk5"] Apr 20 23:16:44.782142 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:16:44.782111 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9635d186-390b-461b-9cd8-869eec113618" path="/var/lib/kubelet/pods/9635d186-390b-461b-9cd8-869eec113618/volumes" Apr 20 23:17:32.676635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:17:32.676610 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 23:18:03.868633 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.868595 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4"] Apr 20 23:18:03.871012 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.868917 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9635d186-390b-461b-9cd8-869eec113618" containerName="console" Apr 20 23:18:03.871012 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.868927 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9635d186-390b-461b-9cd8-869eec113618" containerName="console" Apr 20 23:18:03.871012 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.868998 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9635d186-390b-461b-9cd8-869eec113618" containerName="console" Apr 20 23:18:03.871824 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.871808 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:03.880783 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.880765 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 23:18:03.881518 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.881503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 23:18:03.883319 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.883307 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 23:18:03.883381 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.883357 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-c9lhx\"" Apr 20 23:18:03.883634 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.883621 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 23:18:03.910728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.910703 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4"] Apr 20 23:18:03.966038 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.965998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db0c3048-a728-4289-96f2-bdb02f8643d1-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:03.966183 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.966134 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgt97\" (UniqueName: \"kubernetes.io/projected/db0c3048-a728-4289-96f2-bdb02f8643d1-kube-api-access-zgt97\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:03.966240 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:03.966221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db0c3048-a728-4289-96f2-bdb02f8643d1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.067642 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.067609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgt97\" (UniqueName: \"kubernetes.io/projected/db0c3048-a728-4289-96f2-bdb02f8643d1-kube-api-access-zgt97\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.067642 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.067647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db0c3048-a728-4289-96f2-bdb02f8643d1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.067859 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.067698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db0c3048-a728-4289-96f2-bdb02f8643d1-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.070359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.070335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db0c3048-a728-4289-96f2-bdb02f8643d1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.070500 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.070437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db0c3048-a728-4289-96f2-bdb02f8643d1-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.080020 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.079974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgt97\" (UniqueName: \"kubernetes.io/projected/db0c3048-a728-4289-96f2-bdb02f8643d1-kube-api-access-zgt97\") pod \"opendatahub-operator-controller-manager-5d79c565b7-gf9d4\" (UID: \"db0c3048-a728-4289-96f2-bdb02f8643d1\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.181453 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.181365 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:04.314820 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.314800 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4"] Apr 20 23:18:04.317440 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:18:04.317408 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0c3048_a728_4289_96f2_bdb02f8643d1.slice/crio-9b520abd1fd78516e0a79622cf3ff168ea6ad78e745c11971b2b96e1e82d88e8 WatchSource:0}: Error finding container 9b520abd1fd78516e0a79622cf3ff168ea6ad78e745c11971b2b96e1e82d88e8: Status 404 returned error can't find the container with id 9b520abd1fd78516e0a79622cf3ff168ea6ad78e745c11971b2b96e1e82d88e8 Apr 20 23:18:04.319158 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:04.319140 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 23:18:05.064932 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:05.064891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" event={"ID":"db0c3048-a728-4289-96f2-bdb02f8643d1","Type":"ContainerStarted","Data":"9b520abd1fd78516e0a79622cf3ff168ea6ad78e745c11971b2b96e1e82d88e8"} Apr 20 23:18:08.076288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:08.076255 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" event={"ID":"db0c3048-a728-4289-96f2-bdb02f8643d1","Type":"ContainerStarted","Data":"ee3b7b6b8ed26d0971c7a7057549506e16a2d7ba0020434de25963bceccf300e"} Apr 20 23:18:08.076668 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:08.076398 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:08.103706 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:08.103649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" podStartSLOduration=2.209383533 podStartE2EDuration="5.103635181s" podCreationTimestamp="2026-04-20 23:18:03 +0000 UTC" firstStartedPulling="2026-04-20 23:18:04.319300716 +0000 UTC m=+332.120434390" lastFinishedPulling="2026-04-20 23:18:07.213552361 +0000 UTC m=+335.014686038" observedRunningTime="2026-04-20 23:18:08.102590776 +0000 UTC m=+335.903724519" watchObservedRunningTime="2026-04-20 23:18:08.103635181 +0000 UTC m=+335.904768877" Apr 20 23:18:19.081257 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:19.081227 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-gf9d4" Apr 20 23:18:22.246742 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.246670 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5"] Apr 20 23:18:22.249932 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.249913 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.253518 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.253496 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bw4cf\"" Apr 20 23:18:22.253635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.253513 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 23:18:22.253635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.253513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 23:18:22.253635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.253513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 23:18:22.253635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.253539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 23:18:22.253855 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.253542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:18:22.262366 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.262348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5"] Apr 20 23:18:22.420663 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.420625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdjv9\" (UniqueName: \"kubernetes.io/projected/21e935b3-88f5-416d-947a-44adf920902d-kube-api-access-hdjv9\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: 
\"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.420663 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.420669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/21e935b3-88f5-416d-947a-44adf920902d-manager-config\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.420881 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.420694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/21e935b3-88f5-416d-947a-44adf920902d-metrics-cert\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.420881 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.420818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e935b3-88f5-416d-947a-44adf920902d-cert\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.521793 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.521722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e935b3-88f5-416d-947a-44adf920902d-cert\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.521793 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.521780 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdjv9\" (UniqueName: \"kubernetes.io/projected/21e935b3-88f5-416d-947a-44adf920902d-kube-api-access-hdjv9\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.521966 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.521816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/21e935b3-88f5-416d-947a-44adf920902d-manager-config\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.521966 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.521850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/21e935b3-88f5-416d-947a-44adf920902d-metrics-cert\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.522428 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.522407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/21e935b3-88f5-416d-947a-44adf920902d-manager-config\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.524313 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.524293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/21e935b3-88f5-416d-947a-44adf920902d-metrics-cert\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: 
\"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.524391 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.524344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e935b3-88f5-416d-947a-44adf920902d-cert\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.529488 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.529454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdjv9\" (UniqueName: \"kubernetes.io/projected/21e935b3-88f5-416d-947a-44adf920902d-kube-api-access-hdjv9\") pod \"lws-controller-manager-6577b568b8-lw2x5\" (UID: \"21e935b3-88f5-416d-947a-44adf920902d\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.559454 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.559430 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:22.680365 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:22.680325 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5"] Apr 20 23:18:22.683082 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:18:22.683045 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e935b3_88f5_416d_947a_44adf920902d.slice/crio-338fda634cc20d265660bacabd9dcb6703c42dc0bef5a49a25b7294476ccd806 WatchSource:0}: Error finding container 338fda634cc20d265660bacabd9dcb6703c42dc0bef5a49a25b7294476ccd806: Status 404 returned error can't find the container with id 338fda634cc20d265660bacabd9dcb6703c42dc0bef5a49a25b7294476ccd806 Apr 20 23:18:23.123320 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:23.123285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" event={"ID":"21e935b3-88f5-416d-947a-44adf920902d","Type":"ContainerStarted","Data":"338fda634cc20d265660bacabd9dcb6703c42dc0bef5a49a25b7294476ccd806"} Apr 20 23:18:26.135121 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:26.135088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" event={"ID":"21e935b3-88f5-416d-947a-44adf920902d","Type":"ContainerStarted","Data":"af48faa3b3a783687c53876f6fb9bfca52f69df52cad353c806bf231b59ce5c6"} Apr 20 23:18:26.135521 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:26.135211 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:26.151983 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:26.151935 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" podStartSLOduration=1.504771125 podStartE2EDuration="4.151922949s" podCreationTimestamp="2026-04-20 23:18:22 +0000 UTC" firstStartedPulling="2026-04-20 23:18:22.684946537 +0000 UTC m=+350.486080212" lastFinishedPulling="2026-04-20 23:18:25.332098359 +0000 UTC m=+353.133232036" observedRunningTime="2026-04-20 23:18:26.150199096 +0000 UTC m=+353.951332797" watchObservedRunningTime="2026-04-20 23:18:26.151922949 +0000 UTC m=+353.953056645" Apr 20 23:18:33.584105 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.584074 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n"] Apr 20 23:18:33.586402 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.586388 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.588702 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.588663 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 23:18:33.589555 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.589537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-5l92p\"" Apr 20 23:18:33.589674 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.589558 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 23:18:33.589674 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.589588 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 23:18:33.589674 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.589610 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 23:18:33.595206 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.595188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n"] Apr 20 23:18:33.611760 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.611737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d040c532-c76a-4ff0-a162-340acb927c38-tls-certs\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.611856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.611773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqx6b\" (UniqueName: \"kubernetes.io/projected/d040c532-c76a-4ff0-a162-340acb927c38-kube-api-access-xqx6b\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.611856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.611805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d040c532-c76a-4ff0-a162-340acb927c38-tmp\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.712849 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.712827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d040c532-c76a-4ff0-a162-340acb927c38-tmp\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.712948 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.712895 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d040c532-c76a-4ff0-a162-340acb927c38-tls-certs\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.712948 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.712929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqx6b\" (UniqueName: \"kubernetes.io/projected/d040c532-c76a-4ff0-a162-340acb927c38-kube-api-access-xqx6b\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.715208 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.715180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d040c532-c76a-4ff0-a162-340acb927c38-tmp\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.715377 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.715363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d040c532-c76a-4ff0-a162-340acb927c38-tls-certs\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.720075 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.720059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqx6b\" (UniqueName: \"kubernetes.io/projected/d040c532-c76a-4ff0-a162-340acb927c38-kube-api-access-xqx6b\") pod \"kube-auth-proxy-74c79b5d98-pkd4n\" (UID: \"d040c532-c76a-4ff0-a162-340acb927c38\") " pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:33.896407 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:33.896356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" Apr 20 23:18:34.044238 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:34.044206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n"] Apr 20 23:18:34.047144 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:18:34.047116 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd040c532_c76a_4ff0_a162_340acb927c38.slice/crio-6a074040fd494593d83d2b6d9207af3d2e9fea9d48772b78a09a098daf67d8f9 WatchSource:0}: Error finding container 6a074040fd494593d83d2b6d9207af3d2e9fea9d48772b78a09a098daf67d8f9: Status 404 returned error can't find the container with id 6a074040fd494593d83d2b6d9207af3d2e9fea9d48772b78a09a098daf67d8f9 Apr 20 23:18:34.164237 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:34.164156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" event={"ID":"d040c532-c76a-4ff0-a162-340acb927c38","Type":"ContainerStarted","Data":"6a074040fd494593d83d2b6d9207af3d2e9fea9d48772b78a09a098daf67d8f9"} Apr 20 23:18:37.142444 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:37.142407 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-lw2x5" Apr 20 23:18:38.180214 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:38.180179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" event={"ID":"d040c532-c76a-4ff0-a162-340acb927c38","Type":"ContainerStarted","Data":"f7d7225fde262099d663a77a9301f4a06d9585355d49276a82a10eb57bf49279"} Apr 20 23:18:38.196542 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:18:38.196500 2575 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress/kube-auth-proxy-74c79b5d98-pkd4n" podStartSLOduration=1.8387259889999998 podStartE2EDuration="5.196486275s" podCreationTimestamp="2026-04-20 23:18:33 +0000 UTC" firstStartedPulling="2026-04-20 23:18:34.049287152 +0000 UTC m=+361.850420840" lastFinishedPulling="2026-04-20 23:18:37.407047451 +0000 UTC m=+365.208181126" observedRunningTime="2026-04-20 23:18:38.194994706 +0000 UTC m=+365.996128402" watchObservedRunningTime="2026-04-20 23:18:38.196486275 +0000 UTC m=+365.997619962" Apr 20 23:20:02.717021 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.716987 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b85998796-zznzl"] Apr 20 23:20:02.720344 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.720323 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.744233 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.744205 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b85998796-zznzl"] Apr 20 23:20:02.824231 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.824207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-oauth-serving-cert\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.824352 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.824250 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-service-ca\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.824352 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:20:02.824270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c3fd37-280c-4e4e-8860-72822106bc7a-console-oauth-config\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.824352 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.824288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-console-config\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.824545 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.824380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4446\" (UniqueName: \"kubernetes.io/projected/52c3fd37-280c-4e4e-8860-72822106bc7a-kube-api-access-b4446\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.824545 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.824432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-trusted-ca-bundle\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.824545 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.824492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/52c3fd37-280c-4e4e-8860-72822106bc7a-console-serving-cert\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925536 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-oauth-serving-cert\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-service-ca\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c3fd37-280c-4e4e-8860-72822106bc7a-console-oauth-config\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925646 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-console-config\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925795 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925657 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b4446\" (UniqueName: \"kubernetes.io/projected/52c3fd37-280c-4e4e-8860-72822106bc7a-kube-api-access-b4446\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925851 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-trusted-ca-bundle\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.925901 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.925861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c3fd37-280c-4e4e-8860-72822106bc7a-console-serving-cert\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.926283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.926253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-oauth-serving-cert\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.926365 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.926324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-service-ca\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.926365 ip-10-0-131-251 kubenswrapper[2575]: 
I0420 23:20:02.926324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-console-config\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.927026 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.927002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52c3fd37-280c-4e4e-8860-72822106bc7a-trusted-ca-bundle\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.928246 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.928225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52c3fd37-280c-4e4e-8860-72822106bc7a-console-oauth-config\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.928611 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.928594 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52c3fd37-280c-4e4e-8860-72822106bc7a-console-serving-cert\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:02.935267 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:02.935248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4446\" (UniqueName: \"kubernetes.io/projected/52c3fd37-280c-4e4e-8860-72822106bc7a-kube-api-access-b4446\") pod \"console-5b85998796-zznzl\" (UID: \"52c3fd37-280c-4e4e-8860-72822106bc7a\") " pod="openshift-console/console-5b85998796-zznzl" Apr 20 
23:20:03.029901 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:03.029839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:03.151479 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:03.151444 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b85998796-zznzl"] Apr 20 23:20:03.153915 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:20:03.153891 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52c3fd37_280c_4e4e_8860_72822106bc7a.slice/crio-4aae3524958f83a4182b487927ecc8d4f247ef47289e7b65df8dc443fe730deb WatchSource:0}: Error finding container 4aae3524958f83a4182b487927ecc8d4f247ef47289e7b65df8dc443fe730deb: Status 404 returned error can't find the container with id 4aae3524958f83a4182b487927ecc8d4f247ef47289e7b65df8dc443fe730deb Apr 20 23:20:03.463636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:03.463599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b85998796-zznzl" event={"ID":"52c3fd37-280c-4e4e-8860-72822106bc7a","Type":"ContainerStarted","Data":"f205219fe70106d576b3722f10d23551de8165a81c23179d509e857d1a0057e5"} Apr 20 23:20:03.463799 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:03.463642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b85998796-zznzl" event={"ID":"52c3fd37-280c-4e4e-8860-72822106bc7a","Type":"ContainerStarted","Data":"4aae3524958f83a4182b487927ecc8d4f247ef47289e7b65df8dc443fe730deb"} Apr 20 23:20:03.486690 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:03.486649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b85998796-zznzl" podStartSLOduration=1.486635159 podStartE2EDuration="1.486635159s" podCreationTimestamp="2026-04-20 23:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:20:03.485336931 +0000 UTC m=+451.286470631" watchObservedRunningTime="2026-04-20 23:20:03.486635159 +0000 UTC m=+451.287768854" Apr 20 23:20:06.171393 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.171357 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4"] Apr 20 23:20:06.175359 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.175343 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:06.177644 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.177624 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 23:20:06.178384 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.178356 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 23:20:06.178384 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.178370 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 23:20:06.178554 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.178370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-92rw6\"" Apr 20 23:20:06.184495 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.184454 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4"] Apr 20 23:20:06.256263 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.256230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwsm\" (UniqueName: 
\"kubernetes.io/projected/77b407ed-9def-48e7-af38-280d038271ff-kube-api-access-hhwsm\") pod \"dns-operator-controller-manager-648d5c98bc-lb4t4\" (UID: \"77b407ed-9def-48e7-af38-280d038271ff\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:06.357713 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.357676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwsm\" (UniqueName: \"kubernetes.io/projected/77b407ed-9def-48e7-af38-280d038271ff-kube-api-access-hhwsm\") pod \"dns-operator-controller-manager-648d5c98bc-lb4t4\" (UID: \"77b407ed-9def-48e7-af38-280d038271ff\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:06.366033 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.366011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwsm\" (UniqueName: \"kubernetes.io/projected/77b407ed-9def-48e7-af38-280d038271ff-kube-api-access-hhwsm\") pod \"dns-operator-controller-manager-648d5c98bc-lb4t4\" (UID: \"77b407ed-9def-48e7-af38-280d038271ff\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:06.487219 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.487150 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:06.613733 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:06.613694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4"] Apr 20 23:20:06.616101 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:20:06.616075 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77b407ed_9def_48e7_af38_280d038271ff.slice/crio-27eeb032eb5bc4d24000cce12a834279090484d605b241bb72a90ba16e1517da WatchSource:0}: Error finding container 27eeb032eb5bc4d24000cce12a834279090484d605b241bb72a90ba16e1517da: Status 404 returned error can't find the container with id 27eeb032eb5bc4d24000cce12a834279090484d605b241bb72a90ba16e1517da Apr 20 23:20:07.478387 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:07.478353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" event={"ID":"77b407ed-9def-48e7-af38-280d038271ff","Type":"ContainerStarted","Data":"27eeb032eb5bc4d24000cce12a834279090484d605b241bb72a90ba16e1517da"} Apr 20 23:20:09.487991 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:09.487944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" event={"ID":"77b407ed-9def-48e7-af38-280d038271ff","Type":"ContainerStarted","Data":"18a137485f4ed92c203165e13e1b2132afc0c39faaae57c93609a17df7c2a077"} Apr 20 23:20:09.488372 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:09.488094 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:09.524642 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:09.524588 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" podStartSLOduration=1.4081469420000001 podStartE2EDuration="3.524575779s" podCreationTimestamp="2026-04-20 23:20:06 +0000 UTC" firstStartedPulling="2026-04-20 23:20:06.617963314 +0000 UTC m=+454.419096988" lastFinishedPulling="2026-04-20 23:20:08.734392147 +0000 UTC m=+456.535525825" observedRunningTime="2026-04-20 23:20:09.523293979 +0000 UTC m=+457.324427688" watchObservedRunningTime="2026-04-20 23:20:09.524575779 +0000 UTC m=+457.325709475" Apr 20 23:20:13.030691 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:13.030658 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:13.030691 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:13.030699 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:13.035113 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:13.035095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:13.508706 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:13.508677 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b85998796-zznzl" Apr 20 23:20:13.555062 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:13.555036 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-556c66679c-t75ds"] Apr 20 23:20:19.139422 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.139390 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc"] Apr 20 23:20:19.142743 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.142728 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.145146 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.145124 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ws4dn\"" Apr 20 23:20:19.156482 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.156438 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc"] Apr 20 23:20:19.243283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.243250 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc"] Apr 20 23:20:19.243530 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:20:19.243511 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-zqbkl], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" Apr 20 23:20:19.279524 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.279492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqbkl\" (UniqueName: \"kubernetes.io/projected/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-kube-api-access-zqbkl\") pod \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" (UID: \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.279632 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.279562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-84b657d985-hqpfc\" (UID: \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.380264 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.380237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" (UID: \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.380415 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.380309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqbkl\" (UniqueName: \"kubernetes.io/projected/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-kube-api-access-zqbkl\") pod \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" (UID: \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.380701 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.380677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" (UID: \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.386104 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.386075 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx"] Apr 20 23:20:19.389382 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.389362 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.391640 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.391594 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc"] Apr 20 23:20:19.397702 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:20:19.397682 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zqbkl for pod kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc: failed to fetch token: pods "kuadrant-operator-controller-manager-84b657d985-hqpfc" not found Apr 20 23:20:19.397827 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:20:19.397765 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-kube-api-access-zqbkl podName:05b9f1b3-4888-4fcc-9e45-37e49440e9ad nodeName:}" failed. No retries permitted until 2026-04-20 23:20:19.897746283 +0000 UTC m=+467.698879979 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zqbkl" (UniqueName: "kubernetes.io/projected/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-kube-api-access-zqbkl") pod "kuadrant-operator-controller-manager-84b657d985-hqpfc" (UID: "05b9f1b3-4888-4fcc-9e45-37e49440e9ad") : failed to fetch token: pods "kuadrant-operator-controller-manager-84b657d985-hqpfc" not found Apr 20 23:20:19.398690 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.398146 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc"] Apr 20 23:20:19.402909 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.402890 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx"] Apr 20 23:20:19.481575 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.481547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ggjfx\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.481711 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.481627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpcnn\" (UniqueName: \"kubernetes.io/projected/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-kube-api-access-wpcnn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ggjfx\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.524418 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.524393 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.526557 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.526526 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" err="pods \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" is forbidden: User \"system:node:ip-10-0-131-251.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-251.ec2.internal' and this object" Apr 20 23:20:19.528894 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.528876 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:19.530857 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.530828 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" err="pods \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" is forbidden: User \"system:node:ip-10-0-131-251.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-251.ec2.internal' and this object" Apr 20 23:20:19.582996 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.582974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ggjfx\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.583088 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:20:19.583025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpcnn\" (UniqueName: \"kubernetes.io/projected/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-kube-api-access-wpcnn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ggjfx\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.583318 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.583297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ggjfx\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.590952 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.590927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpcnn\" (UniqueName: \"kubernetes.io/projected/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-kube-api-access-wpcnn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ggjfx\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.684194 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.684142 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-extensions-socket-volume\") pod \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\" (UID: \"05b9f1b3-4888-4fcc-9e45-37e49440e9ad\") " Apr 20 23:20:19.684307 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.684288 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zqbkl\" (UniqueName: 
\"kubernetes.io/projected/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-kube-api-access-zqbkl\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:19.684406 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.684386 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "05b9f1b3-4888-4fcc-9e45-37e49440e9ad" (UID: "05b9f1b3-4888-4fcc-9e45-37e49440e9ad"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:20:19.703721 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.703703 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:19.785159 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.785128 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/05b9f1b3-4888-4fcc-9e45-37e49440e9ad-extensions-socket-volume\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:19.828087 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:19.828054 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx"] Apr 20 23:20:19.831995 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:20:19.831957 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbff8dbe_f846_46d0_8bdf_b363bc4cd2ba.slice/crio-108f96b8c3565e2ff90bf0a760e819f75b85fcdd64f69335b15d824898aa9638 WatchSource:0}: Error finding container 108f96b8c3565e2ff90bf0a760e819f75b85fcdd64f69335b15d824898aa9638: Status 404 returned error can't find the container with id 108f96b8c3565e2ff90bf0a760e819f75b85fcdd64f69335b15d824898aa9638 Apr 20 23:20:20.493987 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:20.493957 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-lb4t4" Apr 20 23:20:20.496178 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:20.496139 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" err="pods \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" is forbidden: User \"system:node:ip-10-0-131-251.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-251.ec2.internal' and this object" Apr 20 23:20:20.529915 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:20.529889 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" Apr 20 23:20:20.530197 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:20.529901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" event={"ID":"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba","Type":"ContainerStarted","Data":"108f96b8c3565e2ff90bf0a760e819f75b85fcdd64f69335b15d824898aa9638"} Apr 20 23:20:20.532009 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:20.531971 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" err="pods \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" is forbidden: User \"system:node:ip-10-0-131-251.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-251.ec2.internal' and this object" Apr 20 23:20:20.538711 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:20:20.538681 2575 status_manager.go:895] "Failed to get status for pod" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-hqpfc" err="pods \"kuadrant-operator-controller-manager-84b657d985-hqpfc\" is forbidden: User \"system:node:ip-10-0-131-251.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-131-251.ec2.internal' and this object" Apr 20 23:20:20.784827 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:20.784761 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b9f1b3-4888-4fcc-9e45-37e49440e9ad" path="/var/lib/kubelet/pods/05b9f1b3-4888-4fcc-9e45-37e49440e9ad/volumes" Apr 20 23:20:24.546524 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:24.546483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" event={"ID":"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba","Type":"ContainerStarted","Data":"5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2"} Apr 20 23:20:24.546891 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:24.546592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:24.565035 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:24.564994 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" podStartSLOduration=1.7212992310000002 podStartE2EDuration="5.564982243s" podCreationTimestamp="2026-04-20 23:20:19 +0000 UTC" firstStartedPulling="2026-04-20 23:20:19.834352317 +0000 UTC m=+467.635485991" lastFinishedPulling="2026-04-20 23:20:23.678035328 +0000 UTC m=+471.479169003" observedRunningTime="2026-04-20 23:20:24.563522393 +0000 UTC m=+472.364656088" watchObservedRunningTime="2026-04-20 
23:20:24.564982243 +0000 UTC m=+472.366115939" Apr 20 23:20:35.551628 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:35.551597 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:38.575361 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.575325 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-556c66679c-t75ds" podUID="3ac8af9d-aaa7-4000-9535-2d4611bcad54" containerName="console" containerID="cri-o://5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb" gracePeriod=15 Apr 20 23:20:38.819582 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.819561 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556c66679c-t75ds_3ac8af9d-aaa7-4000-9535-2d4611bcad54/console/0.log" Apr 20 23:20:38.819703 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.819618 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:20:38.943089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943021 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-oauth-serving-cert\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943089 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943061 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-config\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943281 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943091 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-oauth-config\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943281 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943115 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-serving-cert\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943281 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943183 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-trusted-ca-bundle\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943281 
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943254 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-service-ca\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943510 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943280 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2mr6\" (UniqueName: \"kubernetes.io/projected/3ac8af9d-aaa7-4000-9535-2d4611bcad54-kube-api-access-g2mr6\") pod \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\" (UID: \"3ac8af9d-aaa7-4000-9535-2d4611bcad54\") " Apr 20 23:20:38.943510 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943385 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:20:38.943625 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943548 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-config" (OuterVolumeSpecName: "console-config") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:20:38.943625 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943593 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-service-ca" (OuterVolumeSpecName: "service-ca") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:20:38.943722 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943633 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 23:20:38.943777 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943758 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-trusted-ca-bundle\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:38.943830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943777 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-service-ca\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:38.943830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943795 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-oauth-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:38.943830 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.943810 
2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:38.945720 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.945695 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac8af9d-aaa7-4000-9535-2d4611bcad54-kube-api-access-g2mr6" (OuterVolumeSpecName: "kube-api-access-g2mr6") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "kube-api-access-g2mr6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:20:38.945897 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.945879 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:20:38.945989 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:38.945971 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3ac8af9d-aaa7-4000-9535-2d4611bcad54" (UID: "3ac8af9d-aaa7-4000-9535-2d4611bcad54"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 23:20:39.044709 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.044666 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-oauth-config\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:39.044709 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.044699 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ac8af9d-aaa7-4000-9535-2d4611bcad54-console-serving-cert\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:39.044709 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.044714 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2mr6\" (UniqueName: \"kubernetes.io/projected/3ac8af9d-aaa7-4000-9535-2d4611bcad54-kube-api-access-g2mr6\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:39.600215 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.600188 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-556c66679c-t75ds_3ac8af9d-aaa7-4000-9535-2d4611bcad54/console/0.log" Apr 20 23:20:39.600620 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.600228 2575 generic.go:358] "Generic (PLEG): container finished" podID="3ac8af9d-aaa7-4000-9535-2d4611bcad54" containerID="5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb" exitCode=2 Apr 20 23:20:39.600620 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.600304 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-556c66679c-t75ds" Apr 20 23:20:39.600620 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.600309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556c66679c-t75ds" event={"ID":"3ac8af9d-aaa7-4000-9535-2d4611bcad54","Type":"ContainerDied","Data":"5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb"} Apr 20 23:20:39.600620 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.600342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-556c66679c-t75ds" event={"ID":"3ac8af9d-aaa7-4000-9535-2d4611bcad54","Type":"ContainerDied","Data":"2d2dbacd775185d3b4a40beaf00eac341a7ce4296a73a7bd12ef8f6dfe399378"} Apr 20 23:20:39.600620 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.600365 2575 scope.go:117] "RemoveContainer" containerID="5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb" Apr 20 23:20:39.609127 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.609107 2575 scope.go:117] "RemoveContainer" containerID="5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb" Apr 20 23:20:39.609340 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:20:39.609322 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb\": container with ID starting with 5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb not found: ID does not exist" containerID="5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb" Apr 20 23:20:39.609383 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.609350 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb"} err="failed to get container status \"5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb\": rpc error: code = 
NotFound desc = could not find container \"5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb\": container with ID starting with 5a1a3dd6453c15b80313a61316070b5a2104b929003ac208973e4496d12fadfb not found: ID does not exist" Apr 20 23:20:39.621878 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.621823 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-556c66679c-t75ds"] Apr 20 23:20:39.623565 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:39.623548 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-556c66679c-t75ds"] Apr 20 23:20:40.783101 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:40.783066 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac8af9d-aaa7-4000-9535-2d4611bcad54" path="/var/lib/kubelet/pods/3ac8af9d-aaa7-4000-9535-2d4611bcad54/volumes" Apr 20 23:20:43.624678 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:43.624649 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx"] Apr 20 23:20:43.625131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:43.624931 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" podUID="cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" containerName="manager" containerID="cri-o://5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2" gracePeriod=10 Apr 20 23:20:44.271301 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.271279 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:44.284605 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.284585 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-extensions-socket-volume\") pod \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " Apr 20 23:20:44.284705 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.284626 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpcnn\" (UniqueName: \"kubernetes.io/projected/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-kube-api-access-wpcnn\") pod \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\" (UID: \"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba\") " Apr 20 23:20:44.285009 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.284983 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" (UID: "cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:20:44.286801 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.286779 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-kube-api-access-wpcnn" (OuterVolumeSpecName: "kube-api-access-wpcnn") pod "cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" (UID: "cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba"). InnerVolumeSpecName "kube-api-access-wpcnn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:20:44.385153 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.385125 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wpcnn\" (UniqueName: \"kubernetes.io/projected/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-kube-api-access-wpcnn\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:44.385153 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.385148 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba-extensions-socket-volume\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:20:44.618920 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.618889 2575 generic.go:358] "Generic (PLEG): container finished" podID="cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" containerID="5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2" exitCode=0 Apr 20 23:20:44.619070 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.618950 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" Apr 20 23:20:44.619070 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.618962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" event={"ID":"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba","Type":"ContainerDied","Data":"5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2"} Apr 20 23:20:44.619070 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.618991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx" event={"ID":"cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba","Type":"ContainerDied","Data":"108f96b8c3565e2ff90bf0a760e819f75b85fcdd64f69335b15d824898aa9638"} Apr 20 23:20:44.619070 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.619005 2575 scope.go:117] "RemoveContainer" containerID="5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2" Apr 20 23:20:44.627796 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.627635 2575 scope.go:117] "RemoveContainer" containerID="5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2" Apr 20 23:20:44.627999 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:20:44.627876 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2\": container with ID starting with 5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2 not found: ID does not exist" containerID="5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2" Apr 20 23:20:44.627999 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.627897 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2"} err="failed to get container status 
\"5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2\": rpc error: code = NotFound desc = could not find container \"5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2\": container with ID starting with 5a716bd7d5b2432e9f227fab0869d4fecc84eee09268b4fecfd01a11657da2b2 not found: ID does not exist" Apr 20 23:20:44.648432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.648408 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx"] Apr 20 23:20:44.658507 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.658485 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ggjfx"] Apr 20 23:20:44.782338 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:20:44.782314 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" path="/var/lib/kubelet/pods/cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba/volumes" Apr 20 23:22:05.959897 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.959860 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-68c4fbbd6f-5zhbk"] Apr 20 23:22:05.960432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.960300 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" containerName="manager" Apr 20 23:22:05.960432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.960316 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" containerName="manager" Apr 20 23:22:05.960432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.960333 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ac8af9d-aaa7-4000-9535-2d4611bcad54" containerName="console" Apr 20 23:22:05.960432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.960339 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ac8af9d-aaa7-4000-9535-2d4611bcad54" containerName="console" Apr 20 23:22:05.960432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.960419 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ac8af9d-aaa7-4000-9535-2d4611bcad54" containerName="console" Apr 20 23:22:05.960432 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.960432 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbff8dbe-f846-46d0-8bdf-b363bc4cd2ba" containerName="manager" Apr 20 23:22:05.963670 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.963649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:05.965791 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.965769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 23:22:05.966026 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.966011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-mmwvk\"" Apr 20 23:22:05.972491 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:05.972450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-68c4fbbd6f-5zhbk"] Apr 20 23:22:06.075744 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.075714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2e117427-3bf5-420d-8c87-a22a6a6a543a-data\") pod \"postgres-68c4fbbd6f-5zhbk\" (UID: \"2e117427-3bf5-420d-8c87-a22a6a6a543a\") " pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.075918 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.075795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kczz4\" (UniqueName: \"kubernetes.io/projected/2e117427-3bf5-420d-8c87-a22a6a6a543a-kube-api-access-kczz4\") pod 
\"postgres-68c4fbbd6f-5zhbk\" (UID: \"2e117427-3bf5-420d-8c87-a22a6a6a543a\") " pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.176832 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.176798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kczz4\" (UniqueName: \"kubernetes.io/projected/2e117427-3bf5-420d-8c87-a22a6a6a543a-kube-api-access-kczz4\") pod \"postgres-68c4fbbd6f-5zhbk\" (UID: \"2e117427-3bf5-420d-8c87-a22a6a6a543a\") " pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.176992 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.176862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2e117427-3bf5-420d-8c87-a22a6a6a543a-data\") pod \"postgres-68c4fbbd6f-5zhbk\" (UID: \"2e117427-3bf5-420d-8c87-a22a6a6a543a\") " pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.177224 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.177196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2e117427-3bf5-420d-8c87-a22a6a6a543a-data\") pod \"postgres-68c4fbbd6f-5zhbk\" (UID: \"2e117427-3bf5-420d-8c87-a22a6a6a543a\") " pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.185738 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.185713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kczz4\" (UniqueName: \"kubernetes.io/projected/2e117427-3bf5-420d-8c87-a22a6a6a543a-kube-api-access-kczz4\") pod \"postgres-68c4fbbd6f-5zhbk\" (UID: \"2e117427-3bf5-420d-8c87-a22a6a6a543a\") " pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.274745 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.274682 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:06.394733 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.394706 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-68c4fbbd6f-5zhbk"] Apr 20 23:22:06.397447 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:22:06.397420 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e117427_3bf5_420d_8c87_a22a6a6a543a.slice/crio-b8c803a5ce97a583c8d2e23b2e6e94cdba5ae93f4ce52f41334e1df21c7c9a37 WatchSource:0}: Error finding container b8c803a5ce97a583c8d2e23b2e6e94cdba5ae93f4ce52f41334e1df21c7c9a37: Status 404 returned error can't find the container with id b8c803a5ce97a583c8d2e23b2e6e94cdba5ae93f4ce52f41334e1df21c7c9a37 Apr 20 23:22:06.885200 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:06.885165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" event={"ID":"2e117427-3bf5-420d-8c87-a22a6a6a543a","Type":"ContainerStarted","Data":"b8c803a5ce97a583c8d2e23b2e6e94cdba5ae93f4ce52f41334e1df21c7c9a37"} Apr 20 23:22:11.905058 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:11.905018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" event={"ID":"2e117427-3bf5-420d-8c87-a22a6a6a543a","Type":"ContainerStarted","Data":"9fe23fd711490b15fa73cc1905ba54558a7e0d0f7c9e393a34254d14cb931683"} Apr 20 23:22:11.905416 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:11.905154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:22:11.920236 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:11.920188 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" podStartSLOduration=1.980297358 podStartE2EDuration="6.920174471s" podCreationTimestamp="2026-04-20 23:22:05 +0000 UTC" 
firstStartedPulling="2026-04-20 23:22:06.398675048 +0000 UTC m=+574.199808725" lastFinishedPulling="2026-04-20 23:22:11.33855216 +0000 UTC m=+579.139685838" observedRunningTime="2026-04-20 23:22:11.918870719 +0000 UTC m=+579.720004417" watchObservedRunningTime="2026-04-20 23:22:11.920174471 +0000 UTC m=+579.721308167" Apr 20 23:22:17.937942 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:22:17.937909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-68c4fbbd6f-5zhbk" Apr 20 23:27:26.450164 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.450078 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-t9xpr"] Apr 20 23:27:26.453695 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.453675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" Apr 20 23:27:26.456343 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.456324 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-vgvsr\"" Apr 20 23:27:26.480810 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.480790 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-t9xpr"] Apr 20 23:27:26.481025 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.481001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsljx\" (UniqueName: \"kubernetes.io/projected/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a-kube-api-access-tsljx\") pod \"maas-controller-6d4c8f55f9-t9xpr\" (UID: \"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a\") " pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" Apr 20 23:27:26.582541 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.582516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsljx\" (UniqueName: 
\"kubernetes.io/projected/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a-kube-api-access-tsljx\") pod \"maas-controller-6d4c8f55f9-t9xpr\" (UID: \"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a\") " pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" Apr 20 23:27:26.586293 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.586274 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5c65bd7f55-n2dww"] Apr 20 23:27:26.589511 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.589497 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:26.597083 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.597059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsljx\" (UniqueName: \"kubernetes.io/projected/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a-kube-api-access-tsljx\") pod \"maas-controller-6d4c8f55f9-t9xpr\" (UID: \"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a\") " pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" Apr 20 23:27:26.600752 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.600733 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c65bd7f55-n2dww"] Apr 20 23:27:26.683681 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.683654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvg6\" (UniqueName: \"kubernetes.io/projected/53bbaab9-de56-45bd-bb8b-4325c2b35fe5-kube-api-access-5fvg6\") pod \"maas-controller-5c65bd7f55-n2dww\" (UID: \"53bbaab9-de56-45bd-bb8b-4325c2b35fe5\") " pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:26.706134 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.706067 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c65bd7f55-n2dww"] Apr 20 23:27:26.706323 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:27:26.706301 2575 pod_workers.go:1301] "Error 
syncing pod, skipping" err="unmounted volumes=[kube-api-access-5fvg6], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-5c65bd7f55-n2dww" podUID="53bbaab9-de56-45bd-bb8b-4325c2b35fe5" Apr 20 23:27:26.730911 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.730887 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-56b647664b-vm4d8"] Apr 20 23:27:26.734345 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.734330 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-56b647664b-vm4d8" Apr 20 23:27:26.742593 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.742570 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-56b647664b-vm4d8"] Apr 20 23:27:26.775816 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.775791 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" Apr 20 23:27:26.784454 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.784419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvg6\" (UniqueName: \"kubernetes.io/projected/53bbaab9-de56-45bd-bb8b-4325c2b35fe5-kube-api-access-5fvg6\") pod \"maas-controller-5c65bd7f55-n2dww\" (UID: \"53bbaab9-de56-45bd-bb8b-4325c2b35fe5\") " pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:26.784584 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.784567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tccqv\" (UniqueName: \"kubernetes.io/projected/d69653a3-74c0-43ae-a4b4-686e823d8a83-kube-api-access-tccqv\") pod \"maas-controller-56b647664b-vm4d8\" (UID: \"d69653a3-74c0-43ae-a4b4-686e823d8a83\") " pod="opendatahub/maas-controller-56b647664b-vm4d8" Apr 20 23:27:26.793147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.793090 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fvg6\" (UniqueName: \"kubernetes.io/projected/53bbaab9-de56-45bd-bb8b-4325c2b35fe5-kube-api-access-5fvg6\") pod \"maas-controller-5c65bd7f55-n2dww\" (UID: \"53bbaab9-de56-45bd-bb8b-4325c2b35fe5\") " pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:26.885423 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.885391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tccqv\" (UniqueName: \"kubernetes.io/projected/d69653a3-74c0-43ae-a4b4-686e823d8a83-kube-api-access-tccqv\") pod \"maas-controller-56b647664b-vm4d8\" (UID: \"d69653a3-74c0-43ae-a4b4-686e823d8a83\") " pod="opendatahub/maas-controller-56b647664b-vm4d8" Apr 20 23:27:26.895142 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.895117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tccqv\" (UniqueName: \"kubernetes.io/projected/d69653a3-74c0-43ae-a4b4-686e823d8a83-kube-api-access-tccqv\") pod \"maas-controller-56b647664b-vm4d8\" (UID: \"d69653a3-74c0-43ae-a4b4-686e823d8a83\") " pod="opendatahub/maas-controller-56b647664b-vm4d8" Apr 20 23:27:26.966285 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.966227 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:26.971219 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:26.971199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:27.045134 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.045105 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-56b647664b-vm4d8" Apr 20 23:27:27.087159 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.087135 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fvg6\" (UniqueName: \"kubernetes.io/projected/53bbaab9-de56-45bd-bb8b-4325c2b35fe5-kube-api-access-5fvg6\") pod \"53bbaab9-de56-45bd-bb8b-4325c2b35fe5\" (UID: \"53bbaab9-de56-45bd-bb8b-4325c2b35fe5\") " Apr 20 23:27:27.089222 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.089192 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bbaab9-de56-45bd-bb8b-4325c2b35fe5-kube-api-access-5fvg6" (OuterVolumeSpecName: "kube-api-access-5fvg6") pod "53bbaab9-de56-45bd-bb8b-4325c2b35fe5" (UID: "53bbaab9-de56-45bd-bb8b-4325c2b35fe5"). InnerVolumeSpecName "kube-api-access-5fvg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:27:27.112232 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.112207 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-t9xpr"] Apr 20 23:27:27.114320 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:27:27.114290 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd0e51f_af2f_43d2_919c_b0b0cd84aa1a.slice/crio-371964dfaa135e6c517ac30cb2ad1694649ba7e9f439a9216386e05e78c495f8 WatchSource:0}: Error finding container 371964dfaa135e6c517ac30cb2ad1694649ba7e9f439a9216386e05e78c495f8: Status 404 returned error can't find the container with id 371964dfaa135e6c517ac30cb2ad1694649ba7e9f439a9216386e05e78c495f8 Apr 20 23:27:27.115655 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.115637 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 23:27:27.173190 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.173163 2575 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["opendatahub/maas-controller-56b647664b-vm4d8"] Apr 20 23:27:27.188746 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.188721 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5fvg6\" (UniqueName: \"kubernetes.io/projected/53bbaab9-de56-45bd-bb8b-4325c2b35fe5-kube-api-access-5fvg6\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\"" Apr 20 23:27:27.974424 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.974293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-56b647664b-vm4d8" event={"ID":"d69653a3-74c0-43ae-a4b4-686e823d8a83","Type":"ContainerStarted","Data":"0fb34018480d64b44417186dcf9fd567b36cb47cd84c8dad849d3282c5da959d"} Apr 20 23:27:27.977530 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.977449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c65bd7f55-n2dww" Apr 20 23:27:27.977965 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:27.977940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" event={"ID":"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a","Type":"ContainerStarted","Data":"371964dfaa135e6c517ac30cb2ad1694649ba7e9f439a9216386e05e78c495f8"} Apr 20 23:27:28.032511 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:28.031515 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c65bd7f55-n2dww"] Apr 20 23:27:28.032511 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:28.031558 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5c65bd7f55-n2dww"] Apr 20 23:27:28.784292 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:28.784263 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53bbaab9-de56-45bd-bb8b-4325c2b35fe5" path="/var/lib/kubelet/pods/53bbaab9-de56-45bd-bb8b-4325c2b35fe5/volumes" Apr 20 23:27:30.991856 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:27:30.991823 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-56b647664b-vm4d8" event={"ID":"d69653a3-74c0-43ae-a4b4-686e823d8a83","Type":"ContainerStarted","Data":"5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210"} Apr 20 23:27:30.991856 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:30.991872 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-56b647664b-vm4d8" Apr 20 23:27:30.993279 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:30.993244 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" event={"ID":"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a","Type":"ContainerStarted","Data":"694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796"} Apr 20 23:27:30.993456 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:30.993384 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" Apr 20 23:27:31.010212 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:31.010169 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-56b647664b-vm4d8" podStartSLOduration=1.776489566 podStartE2EDuration="5.010157604s" podCreationTimestamp="2026-04-20 23:27:26 +0000 UTC" firstStartedPulling="2026-04-20 23:27:27.18003711 +0000 UTC m=+894.981170786" lastFinishedPulling="2026-04-20 23:27:30.413705147 +0000 UTC m=+898.214838824" observedRunningTime="2026-04-20 23:27:31.008882688 +0000 UTC m=+898.810016387" watchObservedRunningTime="2026-04-20 23:27:31.010157604 +0000 UTC m=+898.811291297" Apr 20 23:27:31.024173 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:31.024125 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" podStartSLOduration=1.729571636 podStartE2EDuration="5.024111266s" podCreationTimestamp="2026-04-20 23:27:26 
+0000 UTC" firstStartedPulling="2026-04-20 23:27:27.115758667 +0000 UTC m=+894.916892341" lastFinishedPulling="2026-04-20 23:27:30.410298297 +0000 UTC m=+898.211431971" observedRunningTime="2026-04-20 23:27:31.023240961 +0000 UTC m=+898.824374659" watchObservedRunningTime="2026-04-20 23:27:31.024111266 +0000 UTC m=+898.825244965" Apr 20 23:27:33.234547 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.234510 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"] Apr 20 23:27:33.238398 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.238378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" Apr 20 23:27:33.240549 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.240530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 23:27:33.240635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.240562 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8v2gg\"" Apr 20 23:27:33.248070 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.248043 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"] Apr 20 23:27:33.327883 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.327850 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"] Apr 20 23:27:33.349422 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.349393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-config-file\") pod \"limitador-limitador-7d549b5b-rkncs\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" 
Apr 20 23:27:33.349610 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.349558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hdlz\" (UniqueName: \"kubernetes.io/projected/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-kube-api-access-2hdlz\") pod \"limitador-limitador-7d549b5b-rkncs\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" Apr 20 23:27:33.450126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.450088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hdlz\" (UniqueName: \"kubernetes.io/projected/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-kube-api-access-2hdlz\") pod \"limitador-limitador-7d549b5b-rkncs\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" Apr 20 23:27:33.450325 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.450144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-config-file\") pod \"limitador-limitador-7d549b5b-rkncs\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" Apr 20 23:27:33.450797 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.450779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-config-file\") pod \"limitador-limitador-7d549b5b-rkncs\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" Apr 20 23:27:33.458221 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.458195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hdlz\" (UniqueName: 
\"kubernetes.io/projected/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-kube-api-access-2hdlz\") pod \"limitador-limitador-7d549b5b-rkncs\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") " pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs"
Apr 20 23:27:33.549514 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.549398 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs"
Apr 20 23:27:33.673874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.673811 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"]
Apr 20 23:27:33.676516 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:27:33.676456 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda644a9a5_1e7e_46ff_a51b_1e884f41d5f2.slice/crio-6f7535ebbd114674c13048e54a6a489cfccddec43f30d7bf4a35028b5f51ebe9 WatchSource:0}: Error finding container 6f7535ebbd114674c13048e54a6a489cfccddec43f30d7bf4a35028b5f51ebe9: Status 404 returned error can't find the container with id 6f7535ebbd114674c13048e54a6a489cfccddec43f30d7bf4a35028b5f51ebe9
Apr 20 23:27:33.721774 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:33.721750 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"]
Apr 20 23:27:34.004416 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:34.004380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" event={"ID":"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2","Type":"ContainerStarted","Data":"6f7535ebbd114674c13048e54a6a489cfccddec43f30d7bf4a35028b5f51ebe9"}
Apr 20 23:27:37.016055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:37.016020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" event={"ID":"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2","Type":"ContainerStarted","Data":"c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a"}
Apr 20 23:27:37.016415 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:37.016154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs"
Apr 20 23:27:37.032627 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:37.032071 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" podStartSLOduration=1.629098983 podStartE2EDuration="4.032056059s" podCreationTimestamp="2026-04-20 23:27:33 +0000 UTC" firstStartedPulling="2026-04-20 23:27:33.678839344 +0000 UTC m=+901.479973021" lastFinishedPulling="2026-04-20 23:27:36.08179642 +0000 UTC m=+903.882930097" observedRunningTime="2026-04-20 23:27:37.03088214 +0000 UTC m=+904.832015850" watchObservedRunningTime="2026-04-20 23:27:37.032056059 +0000 UTC m=+904.833189758"
Apr 20 23:27:42.001256 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.001225 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr"
Apr 20 23:27:42.001668 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.001600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-56b647664b-vm4d8"
Apr 20 23:27:42.052360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.052333 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-t9xpr"]
Apr 20 23:27:42.052534 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.052508 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" podUID="3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" containerName="manager" containerID="cri-o://694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796" gracePeriod=10
Apr 20 23:27:42.291148 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.291127 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr"
Apr 20 23:27:42.338390 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.338356 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-947ff46f4-q2nfw"]
Apr 20 23:27:42.338746 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.338734 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" containerName="manager"
Apr 20 23:27:42.338790 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.338748 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" containerName="manager"
Apr 20 23:27:42.338829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.338821 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" containerName="manager"
Apr 20 23:27:42.342000 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.341985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:42.348068 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.348045 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-947ff46f4-q2nfw"]
Apr 20 23:27:42.431848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.431824 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsljx\" (UniqueName: \"kubernetes.io/projected/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a-kube-api-access-tsljx\") pod \"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a\" (UID: \"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a\") "
Apr 20 23:27:42.432010 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.431994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4dz\" (UniqueName: \"kubernetes.io/projected/e8516d27-68a6-4dbc-b998-e137bb91f928-kube-api-access-7x4dz\") pod \"maas-controller-947ff46f4-q2nfw\" (UID: \"e8516d27-68a6-4dbc-b998-e137bb91f928\") " pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:42.434083 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.434064 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a-kube-api-access-tsljx" (OuterVolumeSpecName: "kube-api-access-tsljx") pod "3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" (UID: "3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a"). InnerVolumeSpecName "kube-api-access-tsljx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:27:42.533109 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.533048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4dz\" (UniqueName: \"kubernetes.io/projected/e8516d27-68a6-4dbc-b998-e137bb91f928-kube-api-access-7x4dz\") pod \"maas-controller-947ff46f4-q2nfw\" (UID: \"e8516d27-68a6-4dbc-b998-e137bb91f928\") " pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:42.533109 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.533091 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsljx\" (UniqueName: \"kubernetes.io/projected/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a-kube-api-access-tsljx\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:27:42.541128 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.541109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4dz\" (UniqueName: \"kubernetes.io/projected/e8516d27-68a6-4dbc-b998-e137bb91f928-kube-api-access-7x4dz\") pod \"maas-controller-947ff46f4-q2nfw\" (UID: \"e8516d27-68a6-4dbc-b998-e137bb91f928\") " pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:42.654362 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.654337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:42.773288 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:42.773267 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-947ff46f4-q2nfw"]
Apr 20 23:27:42.775688 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:27:42.775660 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8516d27_68a6_4dbc_b998_e137bb91f928.slice/crio-f4c3d7c88b0534c319129e0a1b0609b4e88979308f6d7ac5a83e337ed5df3689 WatchSource:0}: Error finding container f4c3d7c88b0534c319129e0a1b0609b4e88979308f6d7ac5a83e337ed5df3689: Status 404 returned error can't find the container with id f4c3d7c88b0534c319129e0a1b0609b4e88979308f6d7ac5a83e337ed5df3689
Apr 20 23:27:43.038380 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.038344 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-947ff46f4-q2nfw" event={"ID":"e8516d27-68a6-4dbc-b998-e137bb91f928","Type":"ContainerStarted","Data":"f4c3d7c88b0534c319129e0a1b0609b4e88979308f6d7ac5a83e337ed5df3689"}
Apr 20 23:27:43.039503 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.039457 2575 generic.go:358] "Generic (PLEG): container finished" podID="3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" containerID="694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796" exitCode=0
Apr 20 23:27:43.039503 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.039496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" event={"ID":"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a","Type":"ContainerDied","Data":"694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796"}
Apr 20 23:27:43.039669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.039529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr" event={"ID":"3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a","Type":"ContainerDied","Data":"371964dfaa135e6c517ac30cb2ad1694649ba7e9f439a9216386e05e78c495f8"}
Apr 20 23:27:43.039669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.039554 2575 scope.go:117] "RemoveContainer" containerID="694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796"
Apr 20 23:27:43.039669 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.039559 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-t9xpr"
Apr 20 23:27:43.047990 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.047948 2575 scope.go:117] "RemoveContainer" containerID="694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796"
Apr 20 23:27:43.048211 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:27:43.048193 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796\": container with ID starting with 694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796 not found: ID does not exist" containerID="694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796"
Apr 20 23:27:43.048264 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.048218 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796"} err="failed to get container status \"694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796\": rpc error: code = NotFound desc = could not find container \"694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796\": container with ID starting with 694315b2d20335c73ad228efedc23939202bd12b543a83fd87c94b593e6a3796 not found: ID does not exist"
Apr 20 23:27:43.054144 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.054125 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-t9xpr"]
Apr 20 23:27:43.056569 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:43.056544 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-t9xpr"]
Apr 20 23:27:44.044535 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:44.044494 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-947ff46f4-q2nfw" event={"ID":"e8516d27-68a6-4dbc-b998-e137bb91f928","Type":"ContainerStarted","Data":"d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25"}
Apr 20 23:27:44.044974 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:44.044712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:44.060575 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:44.060533 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-947ff46f4-q2nfw" podStartSLOduration=1.5951398289999998 podStartE2EDuration="2.060521684s" podCreationTimestamp="2026-04-20 23:27:42 +0000 UTC" firstStartedPulling="2026-04-20 23:27:42.777031413 +0000 UTC m=+910.578165087" lastFinishedPulling="2026-04-20 23:27:43.242413267 +0000 UTC m=+911.043546942" observedRunningTime="2026-04-20 23:27:44.05905444 +0000 UTC m=+911.860188136" watchObservedRunningTime="2026-04-20 23:27:44.060521684 +0000 UTC m=+911.861655380"
Apr 20 23:27:44.783100 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:44.783060 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a" path="/var/lib/kubelet/pods/3cd0e51f-af2f-43d2-919c-b0b0cd84aa1a/volumes"
Apr 20 23:27:48.021452 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:48.021425 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs"
Apr 20 23:27:48.840724 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:48.840693 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"]
Apr 20 23:27:48.840919 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:48.840893 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" podUID="a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" containerName="limitador" containerID="cri-o://c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a" gracePeriod=30
Apr 20 23:27:49.385159 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.385138 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs"
Apr 20 23:27:49.497038 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.496971 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-config-file\") pod \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") "
Apr 20 23:27:49.497038 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.497017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hdlz\" (UniqueName: \"kubernetes.io/projected/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-kube-api-access-2hdlz\") pod \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\" (UID: \"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2\") "
Apr 20 23:27:49.497294 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.497271 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-config-file" (OuterVolumeSpecName: "config-file") pod "a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" (UID: "a644a9a5-1e7e-46ff-a51b-1e884f41d5f2"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 23:27:49.497398 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.497373 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-config-file\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:27:49.499167 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.499145 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-kube-api-access-2hdlz" (OuterVolumeSpecName: "kube-api-access-2hdlz") pod "a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" (UID: "a644a9a5-1e7e-46ff-a51b-1e884f41d5f2"). InnerVolumeSpecName "kube-api-access-2hdlz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:27:49.598053 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:49.598015 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2hdlz\" (UniqueName: \"kubernetes.io/projected/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2-kube-api-access-2hdlz\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:27:50.067645 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.067606 2575 generic.go:358] "Generic (PLEG): container finished" podID="a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" containerID="c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a" exitCode=0
Apr 20 23:27:50.067829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.067664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" event={"ID":"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2","Type":"ContainerDied","Data":"c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a"}
Apr 20 23:27:50.067829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.067691 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs"
Apr 20 23:27:50.067829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.067710 2575 scope.go:117] "RemoveContainer" containerID="c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a"
Apr 20 23:27:50.067829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.067698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-rkncs" event={"ID":"a644a9a5-1e7e-46ff-a51b-1e884f41d5f2","Type":"ContainerDied","Data":"6f7535ebbd114674c13048e54a6a489cfccddec43f30d7bf4a35028b5f51ebe9"}
Apr 20 23:27:50.076528 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.076505 2575 scope.go:117] "RemoveContainer" containerID="c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a"
Apr 20 23:27:50.076778 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:27:50.076758 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a\": container with ID starting with c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a not found: ID does not exist" containerID="c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a"
Apr 20 23:27:50.076834 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.076787 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a"} err="failed to get container status \"c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a\": rpc error: code = NotFound desc = could not find container \"c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a\": container with ID starting with c6ec828943aff6513156b1ff52a07a500af0d0add07fde6709a06009ab7f1e7a not found: ID does not exist"
Apr 20 23:27:50.087349 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.087329 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"]
Apr 20 23:27:50.089404 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.089380 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-rkncs"]
Apr 20 23:27:50.785679 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:50.785642 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" path="/var/lib/kubelet/pods/a644a9a5-1e7e-46ff-a51b-1e884f41d5f2/volumes"
Apr 20 23:27:55.053889 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.053863 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:27:55.089786 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.089755 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-56b647664b-vm4d8"]
Apr 20 23:27:55.090026 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.090001 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-56b647664b-vm4d8" podUID="d69653a3-74c0-43ae-a4b4-686e823d8a83" containerName="manager" containerID="cri-o://5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210" gracePeriod=10
Apr 20 23:27:55.339841 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.339819 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-56b647664b-vm4d8"
Apr 20 23:27:55.446581 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.446557 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tccqv\" (UniqueName: \"kubernetes.io/projected/d69653a3-74c0-43ae-a4b4-686e823d8a83-kube-api-access-tccqv\") pod \"d69653a3-74c0-43ae-a4b4-686e823d8a83\" (UID: \"d69653a3-74c0-43ae-a4b4-686e823d8a83\") "
Apr 20 23:27:55.448710 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.448684 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d69653a3-74c0-43ae-a4b4-686e823d8a83-kube-api-access-tccqv" (OuterVolumeSpecName: "kube-api-access-tccqv") pod "d69653a3-74c0-43ae-a4b4-686e823d8a83" (UID: "d69653a3-74c0-43ae-a4b4-686e823d8a83"). InnerVolumeSpecName "kube-api-access-tccqv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:27:55.547985 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:55.547944 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tccqv\" (UniqueName: \"kubernetes.io/projected/d69653a3-74c0-43ae-a4b4-686e823d8a83-kube-api-access-tccqv\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:27:56.090658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.090621 2575 generic.go:358] "Generic (PLEG): container finished" podID="d69653a3-74c0-43ae-a4b4-686e823d8a83" containerID="5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210" exitCode=0
Apr 20 23:27:56.090658 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.090659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-56b647664b-vm4d8" event={"ID":"d69653a3-74c0-43ae-a4b4-686e823d8a83","Type":"ContainerDied","Data":"5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210"}
Apr 20 23:27:56.091168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.090685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-56b647664b-vm4d8" event={"ID":"d69653a3-74c0-43ae-a4b4-686e823d8a83","Type":"ContainerDied","Data":"0fb34018480d64b44417186dcf9fd567b36cb47cd84c8dad849d3282c5da959d"}
Apr 20 23:27:56.091168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.090699 2575 scope.go:117] "RemoveContainer" containerID="5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210"
Apr 20 23:27:56.091168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.090710 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-56b647664b-vm4d8"
Apr 20 23:27:56.098955 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.098939 2575 scope.go:117] "RemoveContainer" containerID="5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210"
Apr 20 23:27:56.099210 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:27:56.099191 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210\": container with ID starting with 5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210 not found: ID does not exist" containerID="5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210"
Apr 20 23:27:56.099270 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.099218 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210"} err="failed to get container status \"5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210\": rpc error: code = NotFound desc = could not find container \"5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210\": container with ID starting with 5a004080e1b710a800d209e2ff211fdb7b91b693e98b4ca975a50c15b7f88210 not found: ID does not exist"
Apr 20 23:27:56.114141 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.114114 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-56b647664b-vm4d8"]
Apr 20 23:27:56.120085 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.120065 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-56b647664b-vm4d8"]
Apr 20 23:27:56.783129 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:27:56.783094 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d69653a3-74c0-43ae-a4b4-686e823d8a83" path="/var/lib/kubelet/pods/d69653a3-74c0-43ae-a4b4-686e823d8a83/volumes"
Apr 20 23:28:01.220799 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.220765 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6679f859c-2b5nf"]
Apr 20 23:28:01.221247 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.221142 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d69653a3-74c0-43ae-a4b4-686e823d8a83" containerName="manager"
Apr 20 23:28:01.221247 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.221152 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d69653a3-74c0-43ae-a4b4-686e823d8a83" containerName="manager"
Apr 20 23:28:01.221247 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.221162 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" containerName="limitador"
Apr 20 23:28:01.221247 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.221167 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" containerName="limitador"
Apr 20 23:28:01.221391 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.221258 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d69653a3-74c0-43ae-a4b4-686e823d8a83" containerName="manager"
Apr 20 23:28:01.221391 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.221268 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a644a9a5-1e7e-46ff-a51b-1e884f41d5f2" containerName="limitador"
Apr 20 23:28:01.224809 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.224791 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.228147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.228123 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 23:28:01.228301 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.228120 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 23:28:01.228550 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.228531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-kgbkh\""
Apr 20 23:28:01.230343 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.230321 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6679f859c-2b5nf"]
Apr 20 23:28:01.297075 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.297041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jhc\" (UniqueName: \"kubernetes.io/projected/3fb61f72-1c00-46b0-979c-3990aab49c8c-kube-api-access-92jhc\") pod \"maas-api-6679f859c-2b5nf\" (UID: \"3fb61f72-1c00-46b0-979c-3990aab49c8c\") " pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.297230 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.297124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb61f72-1c00-46b0-979c-3990aab49c8c-maas-api-tls\") pod \"maas-api-6679f859c-2b5nf\" (UID: \"3fb61f72-1c00-46b0-979c-3990aab49c8c\") " pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.398173 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.398136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb61f72-1c00-46b0-979c-3990aab49c8c-maas-api-tls\") pod \"maas-api-6679f859c-2b5nf\" (UID: \"3fb61f72-1c00-46b0-979c-3990aab49c8c\") " pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.398334 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.398194 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92jhc\" (UniqueName: \"kubernetes.io/projected/3fb61f72-1c00-46b0-979c-3990aab49c8c-kube-api-access-92jhc\") pod \"maas-api-6679f859c-2b5nf\" (UID: \"3fb61f72-1c00-46b0-979c-3990aab49c8c\") " pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.400862 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.400836 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fb61f72-1c00-46b0-979c-3990aab49c8c-maas-api-tls\") pod \"maas-api-6679f859c-2b5nf\" (UID: \"3fb61f72-1c00-46b0-979c-3990aab49c8c\") " pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.406539 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.406517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jhc\" (UniqueName: \"kubernetes.io/projected/3fb61f72-1c00-46b0-979c-3990aab49c8c-kube-api-access-92jhc\") pod \"maas-api-6679f859c-2b5nf\" (UID: \"3fb61f72-1c00-46b0-979c-3990aab49c8c\") " pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.536635 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.536539 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:01.661017 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:01.660982 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6679f859c-2b5nf"]
Apr 20 23:28:02.115527 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:02.115460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6679f859c-2b5nf" event={"ID":"3fb61f72-1c00-46b0-979c-3990aab49c8c","Type":"ContainerStarted","Data":"14f50f607d228e24fca1c77f563d05208693ff85462f1c53fb9d8deb64b77a3e"}
Apr 20 23:28:03.120761 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:03.120733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6679f859c-2b5nf" event={"ID":"3fb61f72-1c00-46b0-979c-3990aab49c8c","Type":"ContainerStarted","Data":"9b814fdcb43022cb92b10545d529826240606141bfd96a61c4c81f5c7e7842c1"}
Apr 20 23:28:03.121090 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:03.120881 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6679f859c-2b5nf"
Apr 20 23:28:03.136584 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:03.136540 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6679f859c-2b5nf" podStartSLOduration=0.773589612 podStartE2EDuration="2.136527096s" podCreationTimestamp="2026-04-20 23:28:01 +0000 UTC" firstStartedPulling="2026-04-20 23:28:01.667496846 +0000 UTC m=+929.468630519" lastFinishedPulling="2026-04-20 23:28:03.030434326 +0000 UTC m=+930.831568003" observedRunningTime="2026-04-20 23:28:03.134589393 +0000 UTC m=+930.935723089" watchObservedRunningTime="2026-04-20 23:28:03.136527096 +0000 UTC m=+930.937660791"
Apr 20 23:28:08.822254 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.822219 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"]
Apr 20 23:28:08.829403 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.829382 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.834298 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.833638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 20 23:28:08.834298 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.833721 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 23:28:08.834298 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.833875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-2w8th\""
Apr 20 23:28:08.834298 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.833935 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 23:28:08.834298 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.834073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"]
Apr 20 23:28:08.862848 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.862813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.863006 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.862856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.863006 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.862972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/914ed9b6-f210-4675-bb6a-ed4c46169493-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.863131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.863009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5llf\" (UniqueName: \"kubernetes.io/projected/914ed9b6-f210-4675-bb6a-ed4c46169493-kube-api-access-w5llf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.863131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.863032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.863131 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.863089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964396 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964592 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964592 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964592 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/914ed9b6-f210-4675-bb6a-ed4c46169493-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964592 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5llf\" (UniqueName: \"kubernetes.io/projected/914ed9b6-f210-4675-bb6a-ed4c46169493-kube-api-access-w5llf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964592 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964895 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964974 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.964974 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.964922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"
Apr 20 23:28:08.966739
ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.966717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/914ed9b6-f210-4675-bb6a-ed4c46169493-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" Apr 20 23:28:08.967123 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.967103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/914ed9b6-f210-4675-bb6a-ed4c46169493-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" Apr 20 23:28:08.972283 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:08.972259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5llf\" (UniqueName: \"kubernetes.io/projected/914ed9b6-f210-4675-bb6a-ed4c46169493-kube-api-access-w5llf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v\" (UID: \"914ed9b6-f210-4675-bb6a-ed4c46169493\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" Apr 20 23:28:09.129829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:09.129758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6679f859c-2b5nf" Apr 20 23:28:09.142950 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:09.142918 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" Apr 20 23:28:09.277526 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:09.277504 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v"] Apr 20 23:28:09.279718 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:28:09.279690 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914ed9b6_f210_4675_bb6a_ed4c46169493.slice/crio-7a6080faa63636cfd6a536f704114a7454f828228feaf179241fed4b0349c0eb WatchSource:0}: Error finding container 7a6080faa63636cfd6a536f704114a7454f828228feaf179241fed4b0349c0eb: Status 404 returned error can't find the container with id 7a6080faa63636cfd6a536f704114a7454f828228feaf179241fed4b0349c0eb Apr 20 23:28:10.145750 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:10.145713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" event={"ID":"914ed9b6-f210-4675-bb6a-ed4c46169493","Type":"ContainerStarted","Data":"7a6080faa63636cfd6a536f704114a7454f828228feaf179241fed4b0349c0eb"} Apr 20 23:28:15.169500 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:15.169444 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" event={"ID":"914ed9b6-f210-4675-bb6a-ed4c46169493","Type":"ContainerStarted","Data":"f7187df026ac8d8b033745e3b1d81dc1345c9c8c64f787737d4d13ccbf826271"} Apr 20 23:28:18.765619 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.765572 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2"] Apr 20 23:28:18.769628 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.769604 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.771861 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.771837 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 23:28:18.785673 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.785649 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2"] Apr 20 23:28:18.860978 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.860936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nc4l\" (UniqueName: \"kubernetes.io/projected/0ffd655b-f407-4f65-ba1d-219a3f1e9750-kube-api-access-9nc4l\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.861116 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.861039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.861116 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.861089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.861234 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.861218 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.861589 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.861324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ffd655b-f407-4f65-ba1d-219a3f1e9750-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.861589 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.861438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962426 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ffd655b-f407-4f65-ba1d-219a3f1e9750-tls-certs\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nc4l\" (UniqueName: \"kubernetes.io/projected/0ffd655b-f407-4f65-ba1d-219a3f1e9750-kube-api-access-9nc4l\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962863 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962863 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962828 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.962987 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.963049 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.962995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.964976 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.964952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0ffd655b-f407-4f65-ba1d-219a3f1e9750-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.965175 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.965152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ffd655b-f407-4f65-ba1d-219a3f1e9750-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:18.969898 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:18.969872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nc4l\" (UniqueName: \"kubernetes.io/projected/0ffd655b-f407-4f65-ba1d-219a3f1e9750-kube-api-access-9nc4l\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2\" (UID: \"0ffd655b-f407-4f65-ba1d-219a3f1e9750\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:19.083398 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:19.083361 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:19.218284 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:19.218255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2"] Apr 20 23:28:19.219883 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:28:19.219858 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ffd655b_f407_4f65_ba1d_219a3f1e9750.slice/crio-6c3466603aaf29726688e9dc792ac628f6ecc8852ec32036a584293bdafc32c5 WatchSource:0}: Error finding container 6c3466603aaf29726688e9dc792ac628f6ecc8852ec32036a584293bdafc32c5: Status 404 returned error can't find the container with id 6c3466603aaf29726688e9dc792ac628f6ecc8852ec32036a584293bdafc32c5 Apr 20 23:28:20.191739 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:20.191701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" event={"ID":"0ffd655b-f407-4f65-ba1d-219a3f1e9750","Type":"ContainerStarted","Data":"d2c0bf932787b758d3b6d2f06b9ecc48ee309697348f3e11278dc6d73517a735"} Apr 20 23:28:20.191739 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:20.191741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" event={"ID":"0ffd655b-f407-4f65-ba1d-219a3f1e9750","Type":"ContainerStarted","Data":"6c3466603aaf29726688e9dc792ac628f6ecc8852ec32036a584293bdafc32c5"} Apr 20 23:28:21.196675 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:21.196640 2575 generic.go:358] "Generic (PLEG): container finished" podID="914ed9b6-f210-4675-bb6a-ed4c46169493" containerID="f7187df026ac8d8b033745e3b1d81dc1345c9c8c64f787737d4d13ccbf826271" exitCode=0 Apr 20 23:28:21.197139 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:21.196732 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" event={"ID":"914ed9b6-f210-4675-bb6a-ed4c46169493","Type":"ContainerDied","Data":"f7187df026ac8d8b033745e3b1d81dc1345c9c8c64f787737d4d13ccbf826271"} Apr 20 23:28:23.205759 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:23.205728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" event={"ID":"914ed9b6-f210-4675-bb6a-ed4c46169493","Type":"ContainerStarted","Data":"30dba577915ae17bddad914823fb7317d9af6b171ee9a2fdb5f066faa31b8645"} Apr 20 23:28:23.206135 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:23.205961 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" Apr 20 23:28:23.224874 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:23.224814 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" podStartSLOduration=2.143279027 podStartE2EDuration="15.224798371s" podCreationTimestamp="2026-04-20 23:28:08 +0000 UTC" firstStartedPulling="2026-04-20 23:28:09.281363293 +0000 UTC m=+937.082496967" lastFinishedPulling="2026-04-20 23:28:22.362882625 +0000 UTC m=+950.164016311" observedRunningTime="2026-04-20 23:28:23.221139964 +0000 UTC m=+951.022273672" 
watchObservedRunningTime="2026-04-20 23:28:23.224798371 +0000 UTC m=+951.025932069" Apr 20 23:28:25.213605 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:25.213568 2575 generic.go:358] "Generic (PLEG): container finished" podID="0ffd655b-f407-4f65-ba1d-219a3f1e9750" containerID="d2c0bf932787b758d3b6d2f06b9ecc48ee309697348f3e11278dc6d73517a735" exitCode=0 Apr 20 23:28:25.214001 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:25.213640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" event={"ID":"0ffd655b-f407-4f65-ba1d-219a3f1e9750","Type":"ContainerDied","Data":"d2c0bf932787b758d3b6d2f06b9ecc48ee309697348f3e11278dc6d73517a735"} Apr 20 23:28:26.219358 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:26.219324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" event={"ID":"0ffd655b-f407-4f65-ba1d-219a3f1e9750","Type":"ContainerStarted","Data":"21a6c16f29c4ba68c11b850710c2ecfb9e7cd3981738b527b4e03e413e28d9b9"} Apr 20 23:28:26.219754 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:26.219538 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" Apr 20 23:28:26.237926 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:26.237872 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2" podStartSLOduration=8.049621988 podStartE2EDuration="8.237857285s" podCreationTimestamp="2026-04-20 23:28:18 +0000 UTC" firstStartedPulling="2026-04-20 23:28:25.214219192 +0000 UTC m=+953.015352866" lastFinishedPulling="2026-04-20 23:28:25.402454489 +0000 UTC m=+953.203588163" observedRunningTime="2026-04-20 23:28:26.23657108 +0000 UTC m=+954.037704776" watchObservedRunningTime="2026-04-20 23:28:26.237857285 +0000 UTC m=+954.038990985" Apr 20 23:28:34.222743 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.222714 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v" Apr 20 23:28:34.667713 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.667678 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"] Apr 20 23:28:34.673301 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.673270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.675406 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.675389 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 23:28:34.679042 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.679018 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"] Apr 20 23:28:34.804579 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.804550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f06921bb-cd18-4a6d-91a1-ac142f2fa429-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.804716 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.804617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.804716 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.804665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.804716 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.804685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9lh\" (UniqueName: \"kubernetes.io/projected/f06921bb-cd18-4a6d-91a1-ac142f2fa429-kube-api-access-cw9lh\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.804829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.804732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.804829 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.804787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905141 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f06921bb-cd18-4a6d-91a1-ac142f2fa429-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905282 ip-10-0-131-251 kubenswrapper[2575]: I0420 
23:28:34.905239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9lh\" (UniqueName: \"kubernetes.io/projected/f06921bb-cd18-4a6d-91a1-ac142f2fa429-kube-api-access-cw9lh\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905282 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905613 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905618 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" Apr 20 23:28:34.905817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.905709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:28:34.907666 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.907651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f06921bb-cd18-4a6d-91a1-ac142f2fa429-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:28:34.907911 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.907893 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f06921bb-cd18-4a6d-91a1-ac142f2fa429-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:28:34.913149 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.913124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9lh\" (UniqueName: \"kubernetes.io/projected/f06921bb-cd18-4a6d-91a1-ac142f2fa429-kube-api-access-cw9lh\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4\" (UID: \"f06921bb-cd18-4a6d-91a1-ac142f2fa429\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:28:34.984693 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:34.984635 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:28:35.106281 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:35.106251 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"]
Apr 20 23:28:35.109664 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:28:35.109635 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf06921bb_cd18_4a6d_91a1_ac142f2fa429.slice/crio-614ff0cf5c48f83b10b986de76a6a33baee351884bf63b748b91beea7a7e79bd WatchSource:0}: Error finding container 614ff0cf5c48f83b10b986de76a6a33baee351884bf63b748b91beea7a7e79bd: Status 404 returned error can't find the container with id 614ff0cf5c48f83b10b986de76a6a33baee351884bf63b748b91beea7a7e79bd
Apr 20 23:28:35.255913 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:35.255814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" event={"ID":"f06921bb-cd18-4a6d-91a1-ac142f2fa429","Type":"ContainerStarted","Data":"873d1e54b7a3e1eac7ae587e61393fb86904ce2329fef897e6e778c3e7f907b0"}
Apr 20 23:28:35.255913 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:35.255862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" event={"ID":"f06921bb-cd18-4a6d-91a1-ac142f2fa429","Type":"ContainerStarted","Data":"614ff0cf5c48f83b10b986de76a6a33baee351884bf63b748b91beea7a7e79bd"}
Apr 20 23:28:37.235820 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:37.235774 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2"
Apr 20 23:28:40.278384 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:40.278357 2575 generic.go:358] "Generic (PLEG): container finished" podID="f06921bb-cd18-4a6d-91a1-ac142f2fa429" containerID="873d1e54b7a3e1eac7ae587e61393fb86904ce2329fef897e6e778c3e7f907b0" exitCode=0
Apr 20 23:28:40.278802 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:40.278409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" event={"ID":"f06921bb-cd18-4a6d-91a1-ac142f2fa429","Type":"ContainerDied","Data":"873d1e54b7a3e1eac7ae587e61393fb86904ce2329fef897e6e778c3e7f907b0"}
Apr 20 23:28:41.283111 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:41.283073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" event={"ID":"f06921bb-cd18-4a6d-91a1-ac142f2fa429","Type":"ContainerStarted","Data":"5e3a56b887e3a70b7e7bd932758215dc86c1f6fd38d90781f6efa66df2ca5238"}
Apr 20 23:28:41.283624 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:41.283292 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:28:41.302910 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:41.302863 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4" podStartSLOduration=7.131377811 podStartE2EDuration="7.302850686s" podCreationTimestamp="2026-04-20 23:28:34 +0000 UTC" firstStartedPulling="2026-04-20 23:28:40.279162765 +0000 UTC m=+968.080296440" lastFinishedPulling="2026-04-20 23:28:40.450635638 +0000 UTC m=+968.251769315" observedRunningTime="2026-04-20 23:28:41.299860504 +0000 UTC m=+969.100994199" watchObservedRunningTime="2026-04-20 23:28:41.302850686 +0000 UTC m=+969.103984381"
Apr 20 23:28:52.299014 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:28:52.298934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4"
Apr 20 23:30:57.011304 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.011267 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-947ff46f4-q2nfw"]
Apr 20 23:30:57.011806 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.011526 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-947ff46f4-q2nfw" podUID="e8516d27-68a6-4dbc-b998-e137bb91f928" containerName="manager" containerID="cri-o://d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25" gracePeriod=10
Apr 20 23:30:57.258166 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.258144 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:30:57.310827 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.310797 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4dz\" (UniqueName: \"kubernetes.io/projected/e8516d27-68a6-4dbc-b998-e137bb91f928-kube-api-access-7x4dz\") pod \"e8516d27-68a6-4dbc-b998-e137bb91f928\" (UID: \"e8516d27-68a6-4dbc-b998-e137bb91f928\") "
Apr 20 23:30:57.312939 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.312913 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8516d27-68a6-4dbc-b998-e137bb91f928-kube-api-access-7x4dz" (OuterVolumeSpecName: "kube-api-access-7x4dz") pod "e8516d27-68a6-4dbc-b998-e137bb91f928" (UID: "e8516d27-68a6-4dbc-b998-e137bb91f928"). InnerVolumeSpecName "kube-api-access-7x4dz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:30:57.411959 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.411928 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7x4dz\" (UniqueName: \"kubernetes.io/projected/e8516d27-68a6-4dbc-b998-e137bb91f928-kube-api-access-7x4dz\") on node \"ip-10-0-131-251.ec2.internal\" DevicePath \"\""
Apr 20 23:30:57.775270 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.775177 2575 generic.go:358] "Generic (PLEG): container finished" podID="e8516d27-68a6-4dbc-b998-e137bb91f928" containerID="d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25" exitCode=0
Apr 20 23:30:57.775270 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.775244 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-947ff46f4-q2nfw"
Apr 20 23:30:57.775270 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.775262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-947ff46f4-q2nfw" event={"ID":"e8516d27-68a6-4dbc-b998-e137bb91f928","Type":"ContainerDied","Data":"d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25"}
Apr 20 23:30:57.775588 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.775299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-947ff46f4-q2nfw" event={"ID":"e8516d27-68a6-4dbc-b998-e137bb91f928","Type":"ContainerDied","Data":"f4c3d7c88b0534c319129e0a1b0609b4e88979308f6d7ac5a83e337ed5df3689"}
Apr 20 23:30:57.775588 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.775315 2575 scope.go:117] "RemoveContainer" containerID="d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25"
Apr 20 23:30:57.783879 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.783863 2575 scope.go:117] "RemoveContainer" containerID="d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25"
Apr 20 23:30:57.784134 ip-10-0-131-251 kubenswrapper[2575]: E0420 23:30:57.784117 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25\": container with ID starting with d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25 not found: ID does not exist" containerID="d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25"
Apr 20 23:30:57.784184 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.784142 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25"} err="failed to get container status \"d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25\": rpc error: code = NotFound desc = could not find container \"d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25\": container with ID starting with d5607b7763210dcdc21c1eee6a91a34e92635a984ede101a7f0e6f60eea46e25 not found: ID does not exist"
Apr 20 23:30:57.797668 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.797645 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-947ff46f4-q2nfw"]
Apr 20 23:30:57.801666 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:57.801649 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-947ff46f4-q2nfw"]
Apr 20 23:30:58.543755 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.543721 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-947ff46f4-db4j7"]
Apr 20 23:30:58.544145 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.544132 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8516d27-68a6-4dbc-b998-e137bb91f928" containerName="manager"
Apr 20 23:30:58.544188 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.544147 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8516d27-68a6-4dbc-b998-e137bb91f928" containerName="manager"
Apr 20 23:30:58.544238 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.544230 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8516d27-68a6-4dbc-b998-e137bb91f928" containerName="manager"
Apr 20 23:30:58.548589 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.548575 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:30:58.550896 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.550874 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-vgvsr\""
Apr 20 23:30:58.554456 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.554435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-947ff46f4-db4j7"]
Apr 20 23:30:58.621500 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.621454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckf4\" (UniqueName: \"kubernetes.io/projected/af2c6fed-5d13-407f-a299-a4775bb80d63-kube-api-access-mckf4\") pod \"maas-controller-947ff46f4-db4j7\" (UID: \"af2c6fed-5d13-407f-a299-a4775bb80d63\") " pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:30:58.722424 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.722402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mckf4\" (UniqueName: \"kubernetes.io/projected/af2c6fed-5d13-407f-a299-a4775bb80d63-kube-api-access-mckf4\") pod \"maas-controller-947ff46f4-db4j7\" (UID: \"af2c6fed-5d13-407f-a299-a4775bb80d63\") " pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:30:58.730297 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.730273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckf4\" (UniqueName: \"kubernetes.io/projected/af2c6fed-5d13-407f-a299-a4775bb80d63-kube-api-access-mckf4\") pod \"maas-controller-947ff46f4-db4j7\" (UID: \"af2c6fed-5d13-407f-a299-a4775bb80d63\") " pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:30:58.781773 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.781751 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8516d27-68a6-4dbc-b998-e137bb91f928" path="/var/lib/kubelet/pods/e8516d27-68a6-4dbc-b998-e137bb91f928/volumes"
Apr 20 23:30:58.859868 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.859846 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:30:58.985073 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:58.985049 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-947ff46f4-db4j7"]
Apr 20 23:30:58.987423 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:30:58.987395 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf2c6fed_5d13_407f_a299_a4775bb80d63.slice/crio-3a3b965774f9713ef2ddf57cdfb7725d0545027ebb64378d1d2ea540e8d78fe4 WatchSource:0}: Error finding container 3a3b965774f9713ef2ddf57cdfb7725d0545027ebb64378d1d2ea540e8d78fe4: Status 404 returned error can't find the container with id 3a3b965774f9713ef2ddf57cdfb7725d0545027ebb64378d1d2ea540e8d78fe4
Apr 20 23:30:59.784168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:59.784080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-947ff46f4-db4j7" event={"ID":"af2c6fed-5d13-407f-a299-a4775bb80d63","Type":"ContainerStarted","Data":"fc2a0152b81673db7c54130accaf4b61add259a7a89820f7190ddf86effc0f14"}
Apr 20 23:30:59.784168 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:59.784124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-947ff46f4-db4j7" event={"ID":"af2c6fed-5d13-407f-a299-a4775bb80d63","Type":"ContainerStarted","Data":"3a3b965774f9713ef2ddf57cdfb7725d0545027ebb64378d1d2ea540e8d78fe4"}
Apr 20 23:30:59.784569 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:59.784200 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:30:59.800094 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:30:59.800048 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-947ff46f4-db4j7" podStartSLOduration=1.312677275 podStartE2EDuration="1.800036144s" podCreationTimestamp="2026-04-20 23:30:58 +0000 UTC" firstStartedPulling="2026-04-20 23:30:58.988735127 +0000 UTC m=+1106.789868802" lastFinishedPulling="2026-04-20 23:30:59.476093997 +0000 UTC m=+1107.277227671" observedRunningTime="2026-04-20 23:30:59.798934086 +0000 UTC m=+1107.600067783" watchObservedRunningTime="2026-04-20 23:30:59.800036144 +0000 UTC m=+1107.601169843"
Apr 20 23:31:10.792884 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:31:10.792849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-947ff46f4-db4j7"
Apr 20 23:31:57.047008 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:31:57.046979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6679f859c-2b5nf_3fb61f72-1c00-46b0-979c-3990aab49c8c/maas-api/0.log"
Apr 20 23:31:57.153078 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:31:57.153049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-947ff46f4-db4j7_af2c6fed-5d13-407f-a299-a4775bb80d63/manager/0.log"
Apr 20 23:31:57.508343 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:31:57.508312 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d79c565b7-gf9d4_db0c3048-a728-4289-96f2-bdb02f8643d1/manager/0.log"
Apr 20 23:31:57.743484 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:31:57.743448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-68c4fbbd6f-5zhbk_2e117427-3bf5-420d-8c87-a22a6a6a543a/postgres/0.log"
Apr 20 23:31:59.215680 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:31:59.215648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-lb4t4_77b407ed-9def-48e7-af38-280d038271ff/manager/0.log"
Apr 20 23:32:00.381068 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:00.381041 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-74c79b5d98-pkd4n_d040c532-c76a-4ff0-a162-340acb927c38/kube-auth-proxy/0.log"
Apr 20 23:32:01.065970 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:01.065942 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v_914ed9b6-f210-4675-bb6a-ed4c46169493/storage-initializer/0.log"
Apr 20 23:32:01.073134 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:01.073109 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-8c89v_914ed9b6-f210-4675-bb6a-ed4c46169493/main/0.log"
Apr 20 23:32:01.180843 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:01.180811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2_0ffd655b-f407-4f65-ba1d-219a3f1e9750/storage-initializer/0.log"
Apr 20 23:32:01.188029 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:01.188011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-k84m2_0ffd655b-f407-4f65-ba1d-219a3f1e9750/main/0.log"
Apr 20 23:32:01.405955 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:01.405885 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4_f06921bb-cd18-4a6d-91a1-ac142f2fa429/storage-initializer/0.log"
Apr 20 23:32:01.412923 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:01.412901 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc6spc4_f06921bb-cd18-4a6d-91a1-ac142f2fa429/main/0.log"
Apr 20 23:32:05.278838 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.278802 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2tt9/must-gather-sxw5s"]
Apr 20 23:32:05.282589 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.282573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.285126 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.285106 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2tt9\"/\"openshift-service-ca.crt\""
Apr 20 23:32:05.285217 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.285173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2tt9\"/\"kube-root-ca.crt\""
Apr 20 23:32:05.286122 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.286106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d2tt9\"/\"default-dockercfg-5mhdc\""
Apr 20 23:32:05.290849 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.290830 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/must-gather-sxw5s"]
Apr 20 23:32:05.407982 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.407956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nhz\" (UniqueName: \"kubernetes.io/projected/f123be02-9565-47c5-8a90-d4576591042b-kube-api-access-g7nhz\") pod \"must-gather-sxw5s\" (UID: \"f123be02-9565-47c5-8a90-d4576591042b\") " pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.408120 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.408049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f123be02-9565-47c5-8a90-d4576591042b-must-gather-output\") pod \"must-gather-sxw5s\" (UID: \"f123be02-9565-47c5-8a90-d4576591042b\") " pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.509275 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.509238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f123be02-9565-47c5-8a90-d4576591042b-must-gather-output\") pod \"must-gather-sxw5s\" (UID: \"f123be02-9565-47c5-8a90-d4576591042b\") " pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.509422 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.509327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nhz\" (UniqueName: \"kubernetes.io/projected/f123be02-9565-47c5-8a90-d4576591042b-kube-api-access-g7nhz\") pod \"must-gather-sxw5s\" (UID: \"f123be02-9565-47c5-8a90-d4576591042b\") " pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.509604 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.509586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f123be02-9565-47c5-8a90-d4576591042b-must-gather-output\") pod \"must-gather-sxw5s\" (UID: \"f123be02-9565-47c5-8a90-d4576591042b\") " pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.517147 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.517120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nhz\" (UniqueName: \"kubernetes.io/projected/f123be02-9565-47c5-8a90-d4576591042b-kube-api-access-g7nhz\") pod \"must-gather-sxw5s\" (UID: \"f123be02-9565-47c5-8a90-d4576591042b\") " pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.614555 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.614532 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2tt9/must-gather-sxw5s"
Apr 20 23:32:05.736982 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:05.736958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/must-gather-sxw5s"]
Apr 20 23:32:05.739034 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:32:05.739001 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf123be02_9565_47c5_8a90_d4576591042b.slice/crio-3c899bf4ca7a9beec1ff345a298f4192cc94fac221b9578e56859096f53ff256 WatchSource:0}: Error finding container 3c899bf4ca7a9beec1ff345a298f4192cc94fac221b9578e56859096f53ff256: Status 404 returned error can't find the container with id 3c899bf4ca7a9beec1ff345a298f4192cc94fac221b9578e56859096f53ff256
Apr 20 23:32:06.015011 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:06.014919 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/must-gather-sxw5s" event={"ID":"f123be02-9565-47c5-8a90-d4576591042b","Type":"ContainerStarted","Data":"3c899bf4ca7a9beec1ff345a298f4192cc94fac221b9578e56859096f53ff256"}
Apr 20 23:32:07.021093 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:07.021051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/must-gather-sxw5s" event={"ID":"f123be02-9565-47c5-8a90-d4576591042b","Type":"ContainerStarted","Data":"910909fb42175df7703894600e4d8797e8e667f6f40d12671c52f229c4c90791"}
Apr 20 23:32:07.021093 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:07.021097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/must-gather-sxw5s" event={"ID":"f123be02-9565-47c5-8a90-d4576591042b","Type":"ContainerStarted","Data":"0486b22db964f1d76db0fea8f4561d512437139caf389022e85683954fdaed56"}
Apr 20 23:32:07.036649 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:07.036598 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2tt9/must-gather-sxw5s" podStartSLOduration=1.23315526 podStartE2EDuration="2.036578627s" podCreationTimestamp="2026-04-20 23:32:05 +0000 UTC" firstStartedPulling="2026-04-20 23:32:05.740767564 +0000 UTC m=+1173.541901238" lastFinishedPulling="2026-04-20 23:32:06.544190926 +0000 UTC m=+1174.345324605" observedRunningTime="2026-04-20 23:32:07.034455397 +0000 UTC m=+1174.835589094" watchObservedRunningTime="2026-04-20 23:32:07.036578627 +0000 UTC m=+1174.837712323"
Apr 20 23:32:08.191371 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:08.191343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xs2h6_edb5064a-5a9e-425a-8ba4-f78d68e1f2a8/global-pull-secret-syncer/0.log"
Apr 20 23:32:08.300882 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:08.300854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-d5rtn_8e0fc07a-cb69-4b29-bb33-219b34f8a7ef/konnectivity-agent/0.log"
Apr 20 23:32:08.322728 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:08.322696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-251.ec2.internal_0f860586cfb743e565374c69520bb765/haproxy/0.log"
Apr 20 23:32:12.798969 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:12.798937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-lb4t4_77b407ed-9def-48e7-af38-280d038271ff/manager/0.log"
Apr 20 23:32:14.319143 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.319098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/alertmanager/0.log"
Apr 20 23:32:14.339879 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.339855 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/config-reloader/0.log"
Apr 20 23:32:14.362893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.362868 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/kube-rbac-proxy-web/0.log"
Apr 20 23:32:14.382302 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.382279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/kube-rbac-proxy/0.log"
Apr 20 23:32:14.403333 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.403299 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/kube-rbac-proxy-metric/0.log"
Apr 20 23:32:14.424103 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.424075 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/prom-label-proxy/0.log"
Apr 20 23:32:14.451636 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.451611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6aef86e5-4b10-45e5-aae2-d4b46c4879cc/init-config-reloader/0.log"
Apr 20 23:32:14.519518 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.519489 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vzql_b62cafb7-0ff7-4abf-ad06-094bdd2b3e31/kube-state-metrics/0.log"
Apr 20 23:32:14.538103 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.538080 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vzql_b62cafb7-0ff7-4abf-ad06-094bdd2b3e31/kube-rbac-proxy-main/0.log"
Apr 20 23:32:14.559386 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.559332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7vzql_b62cafb7-0ff7-4abf-ad06-094bdd2b3e31/kube-rbac-proxy-self/0.log"
Apr 20 23:32:14.587111 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.587030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6bc44454dd-hlmkm_3f5b2ea0-50c2-47c7-a5e4-ab66e92b4be6/metrics-server/0.log"
Apr 20 23:32:14.611630 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.611550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-qrkkm_7b72f22d-cde7-4c91-aedc-a1ff5211db09/monitoring-plugin/0.log"
Apr 20 23:32:14.644345 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.644318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8fn88_23e9fc71-0f77-4c32-a564-7860aee3bd59/node-exporter/0.log"
Apr 20 23:32:14.663930 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.663901 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8fn88_23e9fc71-0f77-4c32-a564-7860aee3bd59/kube-rbac-proxy/0.log"
Apr 20 23:32:14.685554 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.685419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8fn88_23e9fc71-0f77-4c32-a564-7860aee3bd59/init-textfile/0.log"
Apr 20 23:32:14.858228 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.858195 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-79qxc_77759cea-016e-42bd-9b8e-28f3fe6613aa/kube-rbac-proxy-main/0.log"
Apr 20 23:32:14.900408 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.900367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-79qxc_77759cea-016e-42bd-9b8e-28f3fe6613aa/kube-rbac-proxy-self/0.log"
Apr 20 23:32:14.922681 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.922652 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-79qxc_77759cea-016e-42bd-9b8e-28f3fe6613aa/openshift-state-metrics/0.log"
Apr 20 23:32:14.960935 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.960904 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/prometheus/0.log"
Apr 20 23:32:14.979160 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:14.979127 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/config-reloader/0.log"
Apr 20 23:32:15.003904 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.003866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/thanos-sidecar/0.log"
Apr 20 23:32:15.024389 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.024358 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/kube-rbac-proxy-web/0.log"
Apr 20 23:32:15.043731 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.043708 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/kube-rbac-proxy/0.log"
Apr 20 23:32:15.066448 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.066424 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/kube-rbac-proxy-thanos/0.log"
Apr 20 23:32:15.091019 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.090996 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_794c7306-b0dc-435f-9bd1-5359c1be1499/init-config-reloader/0.log"
Apr 20 23:32:15.117909 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.117834 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rfkh4_78b55960-2f46-41ab-a33d-7e9dd549a98c/prometheus-operator/0.log"
Apr 20 23:32:15.135693 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.135668 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rfkh4_78b55960-2f46-41ab-a33d-7e9dd549a98c/kube-rbac-proxy/0.log"
Apr 20 23:32:15.159422 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.159396 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-jjlrn_92035fd5-ad3c-409b-8454-202c7e10d36a/prometheus-operator-admission-webhook/0.log"
Apr 20 23:32:15.256520 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.256490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9f489cd-29ctc_a74fdd55-c38d-4b4b-ba43-d7351d05d186/thanos-query/0.log"
Apr 20 23:32:15.281148 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.281119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9f489cd-29ctc_a74fdd55-c38d-4b4b-ba43-d7351d05d186/kube-rbac-proxy-web/0.log"
Apr 20 23:32:15.302961 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.302938 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9f489cd-29ctc_a74fdd55-c38d-4b4b-ba43-d7351d05d186/kube-rbac-proxy/0.log"
Apr 20 23:32:15.321511 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.321489 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9f489cd-29ctc_a74fdd55-c38d-4b4b-ba43-d7351d05d186/prom-label-proxy/0.log"
Apr 20 23:32:15.343994 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.343968 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9f489cd-29ctc_a74fdd55-c38d-4b4b-ba43-d7351d05d186/kube-rbac-proxy-rules/0.log"
Apr 20 23:32:15.364366 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:15.364344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7dc9f489cd-29ctc_a74fdd55-c38d-4b4b-ba43-d7351d05d186/kube-rbac-proxy-metrics/0.log"
Apr 20 23:32:16.440326 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.440286 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"]
Apr 20 23:32:16.446207 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.446177 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.452922 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.452897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"]
Apr 20 23:32:16.545497 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.545442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-podres\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.545817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.545790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-sys\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.545930 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.545842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfzb\" (UniqueName: \"kubernetes.io/projected/487be727-f45f-4165-bd50-9b159b600f26-kube-api-access-qcfzb\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.546001 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.545986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-proc\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.546066 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.546020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-lib-modules\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.647193 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.647151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-proc\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.647439 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.647421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-lib-modules\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.647596 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.647582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-podres\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.647763 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.647750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-sys\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.647895 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.647882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfzb\" (UniqueName: \"kubernetes.io/projected/487be727-f45f-4165-bd50-9b159b600f26-kube-api-access-qcfzb\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"
Apr 20 23:32:16.648588 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.648075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-proc\") pod \"perf-node-gather-daemonset-gskfw\" (UID:
\"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:16.648737 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.648097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-lib-modules\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:16.648826 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.648138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-sys\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:16.648900 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.648164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/487be727-f45f-4165-bd50-9b159b600f26-podres\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:16.657790 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.657767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfzb\" (UniqueName: \"kubernetes.io/projected/487be727-f45f-4165-bd50-9b159b600f26-kube-api-access-qcfzb\") pod \"perf-node-gather-daemonset-gskfw\" (UID: \"487be727-f45f-4165-bd50-9b159b600f26\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:16.763448 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.763371 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:16.934593 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:16.934139 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw"] Apr 20 23:32:16.938908 ip-10-0-131-251 kubenswrapper[2575]: W0420 23:32:16.936759 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod487be727_f45f_4165_bd50_9b159b600f26.slice/crio-5d60658cff4a749044b5971e1fde45201502fcd1d9ef71c1d1b5794f8ecd20e6 WatchSource:0}: Error finding container 5d60658cff4a749044b5971e1fde45201502fcd1d9ef71c1d1b5794f8ecd20e6: Status 404 returned error can't find the container with id 5d60658cff4a749044b5971e1fde45201502fcd1d9ef71c1d1b5794f8ecd20e6 Apr 20 23:32:17.077589 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:17.077556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" event={"ID":"487be727-f45f-4165-bd50-9b159b600f26","Type":"ContainerStarted","Data":"5d60658cff4a749044b5971e1fde45201502fcd1d9ef71c1d1b5794f8ecd20e6"} Apr 20 23:32:17.435893 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:17.435818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b85998796-zznzl_52c3fd37-280c-4e4e-8860-72822106bc7a/console/0.log" Apr 20 23:32:17.471230 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:17.471201 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-m5xhr_b412400c-7349-4590-ad92-575f2cb10591/download-server/0.log" Apr 20 23:32:18.083514 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:18.083452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" 
event={"ID":"487be727-f45f-4165-bd50-9b159b600f26","Type":"ContainerStarted","Data":"10bc58eb43bc9edbe95a29c6711b820e1978f09183f65a7df953391b3c07d0f1"} Apr 20 23:32:18.083689 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:18.083643 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:18.101817 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:18.101759 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" podStartSLOduration=2.101739554 podStartE2EDuration="2.101739554s" podCreationTimestamp="2026-04-20 23:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:32:18.099647173 +0000 UTC m=+1185.900780871" watchObservedRunningTime="2026-04-20 23:32:18.101739554 +0000 UTC m=+1185.902873251" Apr 20 23:32:18.779828 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:18.779803 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8jxc9_90d78c4f-0d8b-4012-9508-a6f166ed7d86/dns/0.log" Apr 20 23:32:18.802001 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:18.801978 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8jxc9_90d78c4f-0d8b-4012-9508-a6f166ed7d86/kube-rbac-proxy/0.log" Apr 20 23:32:18.934340 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:18.934316 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-w7nck_7f44fca8-2437-4e45-8b93-eac1d3f54370/dns-node-resolver/0.log" Apr 20 23:32:19.382015 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:19.381976 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-69bf6599f5-62ddf_dc67f46b-0422-4208-bb95-3e6e2a33d87a/registry/0.log" Apr 20 23:32:19.421580 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:32:19.421556 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h9hnp_f0d58ae6-6867-491f-888f-03272f7c80e7/node-ca/0.log" Apr 20 23:32:20.314560 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:20.314532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-74c79b5d98-pkd4n_d040c532-c76a-4ff0-a162-340acb927c38/kube-auth-proxy/0.log" Apr 20 23:32:20.878248 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:20.878225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-blz49_f78adbde-3788-4728-8a4c-fb1195350fbe/serve-healthcheck-canary/0.log" Apr 20 23:32:21.399656 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:21.399631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7tmld_e5b2202d-031f-4096-8113-72bf2ee199f4/kube-rbac-proxy/0.log" Apr 20 23:32:21.419297 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:21.419275 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7tmld_e5b2202d-031f-4096-8113-72bf2ee199f4/exporter/0.log" Apr 20 23:32:21.438797 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:21.438776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-7tmld_e5b2202d-031f-4096-8113-72bf2ee199f4/extractor/0.log" Apr 20 23:32:23.320343 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:23.320308 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6679f859c-2b5nf_3fb61f72-1c00-46b0-979c-3990aab49c8c/maas-api/0.log" Apr 20 23:32:23.337839 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:23.337809 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-947ff46f4-db4j7_af2c6fed-5d13-407f-a299-a4775bb80d63/manager/0.log" Apr 20 23:32:23.449793 ip-10-0-131-251 
kubenswrapper[2575]: I0420 23:32:23.449758 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d79c565b7-gf9d4_db0c3048-a728-4289-96f2-bdb02f8643d1/manager/0.log" Apr 20 23:32:23.492813 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:23.492764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-68c4fbbd6f-5zhbk_2e117427-3bf5-420d-8c87-a22a6a6a543a/postgres/0.log" Apr 20 23:32:24.098618 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:24.098592 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-gskfw" Apr 20 23:32:24.578717 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:24.578682 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6577b568b8-lw2x5_21e935b3-88f5-416d-947a-44adf920902d/manager/0.log" Apr 20 23:32:30.265781 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.265753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/kube-multus-additional-cni-plugins/0.log" Apr 20 23:32:30.287202 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.287179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/egress-router-binary-copy/0.log" Apr 20 23:32:30.305232 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.305210 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/cni-plugins/0.log" Apr 20 23:32:30.323055 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.323024 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/bond-cni-plugin/0.log" Apr 20 23:32:30.346066 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.346035 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/routeoverride-cni/0.log" Apr 20 23:32:30.364360 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.364343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/whereabouts-cni-bincopy/0.log" Apr 20 23:32:30.384413 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.384392 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jncnx_92cdfbdf-902b-416d-976d-04adddd35e2b/whereabouts-cni/0.log" Apr 20 23:32:30.581913 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.581889 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-l4jz9_41e804c8-cacd-4b38-ba49-6a0ee8e095cf/kube-multus/0.log" Apr 20 23:32:30.641883 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.641857 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qklww_0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd/network-metrics-daemon/0.log" Apr 20 23:32:30.658365 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:30.658346 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qklww_0efe962b-3a09-4fd9-99ca-a2e1bb52f3dd/kube-rbac-proxy/0.log" Apr 20 23:32:32.044035 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.044005 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/ovn-controller/0.log" Apr 20 23:32:32.064525 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.064504 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/ovn-acl-logging/0.log" Apr 20 23:32:32.078993 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.078974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/kube-rbac-proxy-node/0.log" Apr 20 23:32:32.096977 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.096953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 23:32:32.114923 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.114907 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/northd/0.log" Apr 20 23:32:32.132476 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.132454 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/nbdb/0.log" Apr 20 23:32:32.151676 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.151653 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/sbdb/0.log" Apr 20 23:32:32.266662 ip-10-0-131-251 kubenswrapper[2575]: I0420 23:32:32.266633 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lkbxg_172d49a4-6e7e-4772-9e06-73cad0eec748/ovnkube-controller/0.log"