Apr 20 22:24:32.207472 ip-10-0-133-201 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 22:24:32.648060 ip-10-0-133-201 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:24:32.648060 ip-10-0-133-201 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 22:24:32.648060 ip-10-0-133-201 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:24:32.648060 ip-10-0-133-201 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 22:24:32.648060 ip-10-0-133-201 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
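The deprecation warnings above all point the same way: these kubelet flags are meant to move into the file passed via --config. As a rough illustration only (field names are from the KubeletConfiguration v1beta1 API; the values below are assumptions for this sketch, not read from this node), the equivalent config-file form might look like:

```yaml
# Sketch of a kubelet config file replacing the deprecated flags above.
# Values are illustrative assumptions, not taken from this node's config.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is replaced by eviction settings
evictionHard:
  memory.available: "100Mi"
```

The --pod-infra-container-image flag has no config-file equivalent here; per the warning, the sandbox image is instead taken from the CRI runtime's own configuration.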
Apr 20 22:24:32.650804 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.650700 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 22:24:32.654963 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654936 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:24:32.654963 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654962 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:24:32.654963 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654968 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654971 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654975 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654978 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654981 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654984 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654987 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654991 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654994 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.654997 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655001 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655004 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655007 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655010 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655013 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655015 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655018 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655021 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655023 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655026 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:24:32.655074 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655028 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655031 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655034 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655036 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655039 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655041 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655044 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655047 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655049 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655053 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655057 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655060 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655064 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655067 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655070 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655073 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655076 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655078 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655081 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:24:32.655564 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655084 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655087 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655090 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655092 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655095 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655098 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655101 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655103 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655106 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655108 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655111 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655114 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655117 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655119 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655122 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655125 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655128 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655131 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655133 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655136 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:24:32.656017 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655138 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655141 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655143 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655160 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655164 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655169 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655174 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655179 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655184 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655188 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655191 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655194 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655197 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655200 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655203 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655206 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655209 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655212 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655216 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:24:32.656525 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655219 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655222 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655224 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655227 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655229 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655232 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655701 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655707 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655709 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655712 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655715 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655718 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655721 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655724 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655726 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655729 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655731 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655734 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655737 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655746 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:24:32.657081 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655750 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655753 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655755 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655758 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655761 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655763 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655766 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655769 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655771 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655773 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655776 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655778 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655781 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655784 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655786 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655789 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655791 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655794 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655796 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 22:24:32.657585 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655799 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655802 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655804 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655807 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655809 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655812 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655814 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655817 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655819 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655822 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655825 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655828 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655830 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655840 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655843 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655846 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655848 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655850 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655853 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655856 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:24:32.658056 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655859 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655861 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655864 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655866 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655869 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655871 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655874 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655876 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655879 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655882 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655886 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655889 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655891 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655894 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655896 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655900 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655902 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655905 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655907 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655911 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:24:32.658562 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655913 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655916 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655920 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655923 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655927 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655930 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655938 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655941 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655943 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655946 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655949 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655951 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.655954 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657679 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657691 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657701 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657706 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657711 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657714 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657719 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657723 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 22:24:32.659049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657727 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657730 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657733 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657737 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657740 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657743 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657746 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657749 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657752 2568 flags.go:64] FLAG: --cloud-config=""
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657755 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657759 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657767 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657770 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657773 2568 flags.go:64] FLAG: --config-dir=""
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657776 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657780 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657784 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657788 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657799 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657803 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657806 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657809 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657812 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657816 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657819 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 22:24:32.659590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657823 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657826 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657833 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657836 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657839 2568 flags.go:64] FLAG: --enable-server="true"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657842 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657849 2568 flags.go:64] FLAG: --event-burst="100"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657852 2568 flags.go:64] FLAG: --event-qps="50"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657855 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657858 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657861 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657866 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657869 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657872 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657875 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657879 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657882 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657885 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657888 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657892 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657895 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 22:24:32.660244 ip-10-0-133-201
kubenswrapper[2568]: I0420 22:24:32.657898 2568 flags.go:64] FLAG: --feature-gates="" Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657902 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657906 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657909 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 22:24:32.660244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657913 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657922 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657925 2568 flags.go:64] FLAG: --help="false" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657928 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-133-201.ec2.internal" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657932 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657935 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657938 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657942 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657946 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657949 2568 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657952 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657954 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657957 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657961 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657964 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657967 2568 flags.go:64] FLAG: --kube-reserved="" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657970 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657973 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657976 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657979 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657982 2568 flags.go:64] FLAG: --lock-file="" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657985 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657988 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.657992 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 22:24:32.660877 ip-10-0-133-201 kubenswrapper[2568]: 
I0420 22:24:32.658002 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658005 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658008 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658011 2568 flags.go:64] FLAG: --logging-format="text" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658014 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658017 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658020 2568 flags.go:64] FLAG: --manifest-url="" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658023 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658027 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658031 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658043 2568 flags.go:64] FLAG: --max-pods="110" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658046 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658049 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658052 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658055 2568 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658058 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658062 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658065 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658073 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658077 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658080 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658083 2568 flags.go:64] FLAG: --pod-cidr="" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658086 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 22:24:32.661477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658093 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658096 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658100 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658102 2568 flags.go:64] FLAG: --port="10250" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658105 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658108 
2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c8fc7f13872fc9b6" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658112 2568 flags.go:64] FLAG: --qos-reserved="" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658115 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658118 2568 flags.go:64] FLAG: --register-node="true" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658122 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658125 2568 flags.go:64] FLAG: --register-with-taints="" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658129 2568 flags.go:64] FLAG: --registry-burst="10" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658133 2568 flags.go:64] FLAG: --registry-qps="5" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658135 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658138 2568 flags.go:64] FLAG: --reserved-memory="" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658142 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658145 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658160 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658163 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658166 2568 flags.go:64] FLAG: --runonce="false" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658175 2568 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658179 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658182 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658185 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658188 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658191 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 22:24:32.662061 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658194 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658197 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658200 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658203 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658207 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658210 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658213 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658216 2568 flags.go:64] FLAG: --system-cgroups="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658219 2568 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658225 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658228 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658231 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658238 2568 flags.go:64] FLAG: --tls-min-version="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658241 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658243 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658247 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658250 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658253 2568 flags.go:64] FLAG: --v="2" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658258 2568 flags.go:64] FLAG: --version="false" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658262 2568 flags.go:64] FLAG: --vmodule="" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658267 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.658270 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658409 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:24:32.662701 
ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658413 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:24:32.662701 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658416 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658419 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658422 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658430 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658433 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658436 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658439 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658441 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658444 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658446 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658449 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658451 2568 feature_gate.go:328] unrecognized feature 
gate: ConsolePluginContentSecurityPolicy Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658455 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658458 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658460 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658463 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658466 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658469 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658471 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658474 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:24:32.663293 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658476 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658480 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658482 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658484 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:24:32.663830 ip-10-0-133-201 
kubenswrapper[2568]: W0420 22:24:32.658487 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658490 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658492 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658495 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658497 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658500 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658502 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658504 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658509 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658512 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658515 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658518 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658526 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658535 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658538 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658540 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:24:32.663830 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658543 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658546 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658550 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658554 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658560 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658563 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658566 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658569 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658571 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658574 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658577 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658580 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658582 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658585 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658587 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658590 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658592 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658595 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658597 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658600 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:24:32.664354 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658603 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658605 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658608 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658610 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658613 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658616 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658618 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658621 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658623 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658632 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658635 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658637 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658640 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658642 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658645 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658647 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658651 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658654 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658656 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658659 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:24:32.664876 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658661 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:24:32.665475 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658663 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:24:32.665475 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658666 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:24:32.665475 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.658669 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:24:32.665475 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.659448 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 22:24:32.667449 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.667421 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 22:24:32.667449 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.667446 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667499 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667505 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667508 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667511 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667515 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667518 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667521 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667524 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667527 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667529 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667533 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667535 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667538 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667540 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667543 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667546 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667548 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667551 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667553 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:24:32.667592 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667556 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667558 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667561 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667563 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667566 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667568 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667571 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667575 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667577 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667580 2568
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667583 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667586 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667588 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667591 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667595 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667598 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667600 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667603 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667606 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:24:32.668086 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667609 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667611 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667614 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:24:32.668584 
ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667617 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667619 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667622 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667624 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667627 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667630 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667632 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667635 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667637 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667640 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667642 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667645 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667647 2568 feature_gate.go:328] 
unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667650 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667653 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667655 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667658 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:24:32.668584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667660 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667664 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667666 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667669 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667671 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667674 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667677 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667682 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667687 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667690 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667693 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667696 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667699 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667701 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667704 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667707 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667710 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667713 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667716 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667720 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:24:32.669104 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667723 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver 
Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667725 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667729 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667733 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667736 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667738 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667742 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667745 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.667750 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667850 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667856 2568 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667859 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667862 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667865 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667868 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 22:24:32.669606 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667872 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667876 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667879 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667882 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667886 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667889 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667892 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667895 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks 
Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667898 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667900 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667903 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667906 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667909 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667911 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667914 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667917 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667921 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667923 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667926 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667929 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:24:32.669974 ip-10-0-133-201 kubenswrapper[2568]: W0420 
22:24:32.667931 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667934 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667937 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667939 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667942 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667944 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667947 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667949 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667952 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667955 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667957 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667962 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667965 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667968 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667970 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667973 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667975 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667978 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667981 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667984 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 22:24:32.670488 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667986 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667989 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667992 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667994 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667996 2568 feature_gate.go:328] unrecognized feature 
gate: GatewayAPIController Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.667999 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668002 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668004 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668007 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668010 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668013 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668015 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668018 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668020 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668023 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668025 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668028 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668030 2568 
feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668033 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668036 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:24:32.670983 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668038 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668041 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668043 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668046 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668048 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668051 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668053 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668056 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668058 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668061 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 22:24:32.671500 
ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668064 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668067 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668070 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668072 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668075 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668077 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668080 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668082 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668085 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:24:32.671500 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:32.668087 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:24:32.671972 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.668093 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:24:32.671972 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.668733 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 22:24:32.672981 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.672962 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 22:24:32.673895 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.673882 2568 server.go:1019] "Starting client certificate rotation" Apr 20 22:24:32.674007 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.673985 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 22:24:32.674049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.674037 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 22:24:32.700040 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.700011 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 22:24:32.704420 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.704397 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 22:24:32.721522 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.721488 2568 log.go:25] "Validated CRI v1 runtime API" Apr 20 22:24:32.727721 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.727694 2568 log.go:25] "Validated CRI v1 image API" Apr 20 22:24:32.729560 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.729541 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 22:24:32.730731 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:24:32.730707 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 22:24:32.732200 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.732177 2568 fs.go:135] Filesystem UUIDs: map[011d9029-83fc-477c-9de0-13e7a9f87915:/dev/nvme0n1p4 5a1a5487-43bf-420c-a228-ea82d84d2c56:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 20 22:24:32.732259 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.732200 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 22:24:32.739542 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.739399 2568 manager.go:217] Machine: {Timestamp:2026-04-20 22:24:32.737341814 +0000 UTC m=+0.410529490 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3113703 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24c93e27fa03c3b8afbbaf7bee8bcc SystemUUID:ec24c93e-27fa-03c3-b8af-bbaf7bee8bcc BootID:1b6655ce-48b1-4007-a472-57ea90d70cea Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4b:52:45:a3:cf Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4b:52:45:a3:cf Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:bb:a4:08:eb:f3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 22:24:32.739542 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.739522 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 22:24:32.739722 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.739626 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 22:24:32.740704 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.740671 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 22:24:32.740861 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.740707 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-201.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 22:24:32.740909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.740871 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 22:24:32.740909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.740881 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 22:24:32.740909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.740895 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 22:24:32.741632 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.741620 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 22:24:32.742431 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.742396 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 20 22:24:32.742556 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.742547 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 22:24:32.745395 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.745374 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b29bw" Apr 20 22:24:32.745435 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.745427 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 20 22:24:32.745475 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.745439 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 22:24:32.745475 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.745452 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 22:24:32.745475 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.745462 2568 kubelet.go:397] "Adding apiserver pod source" Apr 20 22:24:32.745475 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.745472 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 22:24:32.746791 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.746776 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 22:24:32.746841 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.746797 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 22:24:32.750247 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.750228 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 22:24:32.751674 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.751655 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 22:24:32.752628 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.752611 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b29bw" Apr 20 22:24:32.753600 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753587 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753609 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753620 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753628 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 22:24:32.753661 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753634 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753640 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753647 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753654 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 22:24:32.753661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753661 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 22:24:32.753864 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753667 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 22:24:32.753864 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753692 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 22:24:32.753864 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.753702 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 22:24:32.755378 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.755363 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 22:24:32.755378 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.755377 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 22:24:32.757941 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.757920 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:32.759333 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.759317 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 22:24:32.759413 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:24:32.759360 2568 server.go:1295] "Started kubelet" Apr 20 22:24:32.759507 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.759473 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 22:24:32.759559 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.759511 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 22:24:32.759589 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.759581 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 22:24:32.760196 ip-10-0-133-201 systemd[1]: Started Kubernetes Kubelet. Apr 20 22:24:32.760311 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.760293 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:32.761414 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.761396 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 22:24:32.762071 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.762057 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 20 22:24:32.763653 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.763634 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-201.ec2.internal" not found Apr 20 22:24:32.765811 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.765779 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 22:24:32.766253 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.766236 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 22:24:32.766870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.766843 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 22:24:32.767081 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.767050 2568 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"ip-10-0-133-201.ec2.internal\" not found" Apr 20 22:24:32.767202 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767183 2568 factory.go:55] Registering systemd factory Apr 20 22:24:32.767270 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767208 2568 factory.go:223] Registration of the systemd container factory successfully Apr 20 22:24:32.767270 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767231 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 22:24:32.767270 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767249 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 22:24:32.767484 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767466 2568 factory.go:153] Registering CRI-O factory Apr 20 22:24:32.767544 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767487 2568 factory.go:223] Registration of the crio container factory successfully Apr 20 22:24:32.767596 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767544 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 22:24:32.767596 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767569 2568 factory.go:103] Registering Raw factory Apr 20 22:24:32.767596 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767583 2568 manager.go:1196] Started watching for new ooms in manager Apr 20 22:24:32.767778 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767764 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 20 22:24:32.767855 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.767837 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 20 22:24:32.768177 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.768143 2568 manager.go:319] Starting recovery of all containers Apr 20 22:24:32.769206 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.769184 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:32.769473 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.769231 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 22:24:32.771889 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.771860 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-201.ec2.internal\" not found" node="ip-10-0-133-201.ec2.internal" Apr 20 22:24:32.776123 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.776082 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 22:24:32.779342 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.779319 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-201.ec2.internal" not found Apr 20 22:24:32.781693 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.781405 2568 manager.go:324] Recovery completed Apr 20 22:24:32.786346 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.786331 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:32.788391 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.788375 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:32.788468 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.788406 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:32.788468 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.788420 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" 
event="NodeHasSufficientPID" Apr 20 22:24:32.788955 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.788943 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 22:24:32.788955 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.788954 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 22:24:32.789028 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.788971 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 20 22:24:32.792278 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.792260 2568 policy_none.go:49] "None policy: Start" Apr 20 22:24:32.792340 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.792282 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 22:24:32.792340 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.792293 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 20 22:24:32.839809 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.839785 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-201.ec2.internal" not found Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.839815 2568 manager.go:341] "Starting Device Plugin manager" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.839862 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.839875 2568 server.go:85] "Starting device plugin registration server" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.840259 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.840272 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.840840 2568 
plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.840944 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.840953 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.842073 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 22:24:32.861826 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.842116 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-201.ec2.internal\" not found" Apr 20 22:24:32.908200 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.908106 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 22:24:32.908200 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.908172 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 22:24:32.908371 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.908206 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 22:24:32.908371 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.908214 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 22:24:32.908371 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.908255 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 22:24:32.911756 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.911733 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:32.941397 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.941363 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 22:24:32.942790 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.942771 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" event="NodeHasSufficientMemory" Apr 20 22:24:32.942894 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.942804 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 22:24:32.942894 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.942818 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" event="NodeHasSufficientPID" Apr 20 22:24:32.942894 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.942841 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-201.ec2.internal" Apr 20 22:24:32.951309 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:32.951292 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-201.ec2.internal" Apr 20 22:24:32.951374 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:32.951316 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-201.ec2.internal\": node \"ip-10-0-133-201.ec2.internal\" not found" Apr 20 
22:24:33.008892 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.008841 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal"] Apr 20 22:24:33.013302 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.013282 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.013302 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.013294 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.045031 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.045008 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.049591 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.049574 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.058025 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.058003 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 22:24:33.061017 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.060997 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 22:24:33.069299 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.069268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f5a116cbed3e163e10f394a1de03350e-etc-kube\") 
pod \"kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal\" (UID: \"f5a116cbed3e163e10f394a1de03350e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.069412 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.069304 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5a116cbed3e163e10f394a1de03350e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal\" (UID: \"f5a116cbed3e163e10f394a1de03350e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.069412 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.069333 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/674e8501a0d803973607282eda51f055-config\") pod \"kube-apiserver-proxy-ip-10-0-133-201.ec2.internal\" (UID: \"674e8501a0d803973607282eda51f055\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.169720 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.169635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5a116cbed3e163e10f394a1de03350e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal\" (UID: \"f5a116cbed3e163e10f394a1de03350e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.169720 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.169669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/674e8501a0d803973607282eda51f055-config\") pod \"kube-apiserver-proxy-ip-10-0-133-201.ec2.internal\" (UID: \"674e8501a0d803973607282eda51f055\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.169720 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.169687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f5a116cbed3e163e10f394a1de03350e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal\" (UID: \"f5a116cbed3e163e10f394a1de03350e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.169923 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.169729 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/674e8501a0d803973607282eda51f055-config\") pod \"kube-apiserver-proxy-ip-10-0-133-201.ec2.internal\" (UID: \"674e8501a0d803973607282eda51f055\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.169923 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.169735 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f5a116cbed3e163e10f394a1de03350e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal\" (UID: \"f5a116cbed3e163e10f394a1de03350e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.169923 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.169776 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f5a116cbed3e163e10f394a1de03350e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal\" (UID: \"f5a116cbed3e163e10f394a1de03350e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.361092 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.361045 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.363516 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.363493 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" Apr 20 22:24:33.674731 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.674692 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 22:24:33.675333 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.674875 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 22:24:33.675333 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.674898 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 22:24:33.675333 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.674912 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 22:24:33.745803 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.745762 2568 apiserver.go:52] "Watching apiserver" Apr 20 22:24:33.753079 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.753052 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 22:24:33.753449 ip-10-0-133-201 kubenswrapper[2568]: 
I0420 22:24:33.753422 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk","openshift-image-registry/node-ca-j77rr","openshift-network-diagnostics/network-check-target-shrcq","openshift-network-operator/iptables-alerter-shw5t","openshift-ovn-kubernetes/ovnkube-node-r2zfd","kube-system/konnectivity-agent-t89tm","kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal","openshift-cluster-node-tuning-operator/tuned-n2cxv","openshift-dns/node-resolver-6cs9w","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal","openshift-multus/multus-additional-cni-plugins-4npn9","openshift-multus/multus-snx6p","openshift-multus/network-metrics-daemon-5kgfv"] Apr 20 22:24:33.754307 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.754277 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 22:19:32 +0000 UTC" deadline="2027-10-30 07:38:08.244313354 +0000 UTC" Apr 20 22:24:33.754392 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.754303 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13377h13m34.490013058s" Apr 20 22:24:33.756483 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.756459 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-t89tm"
Apr 20 22:24:33.758727 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.758701 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 22:24:33.758832 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.758775 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 22:24:33.758832 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.758788 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5j57c\""
Apr 20 22:24:33.758832 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.758775 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv"
Apr 20 22:24:33.758970 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.758910 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2"
Apr 20 22:24:33.760848 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.760827 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq"
Apr 20 22:24:33.760961 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.760892 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961"
Apr 20 22:24:33.763531 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.763504 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-shw5t"
Apr 20 22:24:33.765719 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.765698 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.765919 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.765897 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 22:24:33.766309 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.766295 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.766636 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.766617 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.766636 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.766617 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 22:24:33.766770 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.766653 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qdkbs\""
Apr 20 22:24:33.767868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.767849 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.767973 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.767869 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.767973 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.767902 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qg7l5\""
Apr 20 22:24:33.768085 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.767975 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.769893 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.769857 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 22:24:33.770206 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.770188 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-hzkrk\""
Apr 20 22:24:33.770308 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.770207 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.770308 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.770209 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.770425 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.770282 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.772676 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-registration-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.772774 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772688 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-device-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.772774 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772715 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnvj5\" (UniqueName: \"kubernetes.io/projected/ffc7a641-db0a-4528-b993-5560e673b5d5-kube-api-access-cnvj5\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t"
Apr 20 22:24:33.772774 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-sys-fs\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.772926 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772784 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d33395f3-9f3c-4562-ad4d-b1058d8551bf-konnectivity-ca\") pod \"konnectivity-agent-t89tm\" (UID: \"d33395f3-9f3c-4562-ad4d-b1058d8551bf\") " pod="kube-system/konnectivity-agent-t89tm"
Apr 20 22:24:33.772926 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772789 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nnvlk\""
Apr 20 22:24:33.772926 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq"
Apr 20 22:24:33.772926 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772830 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.773106 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772925 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.773106 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773009 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.773106 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.772832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffc7a641-db0a-4528-b993-5560e673b5d5-iptables-alerter-script\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t"
Apr 20 22:24:33.773244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2a5c1d-a5d3-4341-8c92-2a050066670f-tmp-dir\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp7kd\" (UniqueName: \"kubernetes.io/projected/fa2a5c1d-a5d3-4341-8c92-2a050066670f-kube-api-access-cp7kd\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-etc-selinux\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773741 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2r4c\" (UniqueName: \"kubernetes.io/projected/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-kube-api-access-d2r4c\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773770 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d33395f3-9f3c-4562-ad4d-b1058d8551bf-agent-certs\") pod \"konnectivity-agent-t89tm\" (UID: \"d33395f3-9f3c-4562-ad4d-b1058d8551bf\") " pod="kube-system/konnectivity-agent-t89tm"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv"
Apr 20 22:24:33.773966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.773966 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrb7\" (UniqueName: \"kubernetes.io/projected/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-kube-api-access-kfrb7\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv"
Apr 20 22:24:33.774514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.774013 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffc7a641-db0a-4528-b993-5560e673b5d5-host-slash\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t"
Apr 20 22:24:33.774514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.774260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa2a5c1d-a5d3-4341-8c92-2a050066670f-hosts-file\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.774514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.774308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-socket-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.775276 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.775113 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 22:24:33.775523 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.775504 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.776049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.776019 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pf45g\""
Apr 20 22:24:33.776239 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.776220 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 22:24:33.776377 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.776360 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.776449 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.776415 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 22:24:33.777050 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.777019 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 22:24:33.777302 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.777284 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.778324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778307 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7bhr2\""
Apr 20 22:24:33.778458 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778343 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.778458 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778367 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.778458 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778415 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 22:24:33.778697 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778682 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 22:24:33.778770 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778713 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 22:24:33.778983 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.778964 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.780566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.780535 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 22:24:33.780693 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.780592 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.780982 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.780967 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 22:24:33.781066 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.781012 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n5pr5\""
Apr 20 22:24:33.783055 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.783036 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 22:24:33.783219 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.783206 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 22:24:33.783298 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.783283 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n79gs\""
Apr 20 22:24:33.783589 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.783574 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 22:24:33.803072 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.803041 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-th2sh"
Apr 20 22:24:33.810705 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.810676 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-th2sh"
Apr 20 22:24:33.867735 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.867708 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 22:24:33.874892 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.874870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-cnibin\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.874949 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.874899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-netns\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.874949 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.874919 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-conf-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.874949 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.874939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffc7a641-db0a-4528-b993-5560e673b5d5-iptables-alerter-script\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t"
Apr 20 22:24:33.875036 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.874991 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2a5c1d-a5d3-4341-8c92-2a050066670f-tmp-dir\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.875081 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875063 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysctl-d\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.875133 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875105 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-systemd\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.875190 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-env-overrides\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875222 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovnkube-script-lib\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875256 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875239 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp7kd\" (UniqueName: \"kubernetes.io/projected/fa2a5c1d-a5d3-4341-8c92-2a050066670f-kube-api-access-cp7kd\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.875292 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-etc-selinux\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.875331 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrb7\" (UniqueName: \"kubernetes.io/projected/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-kube-api-access-kfrb7\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv"
Apr 20 22:24:33.875331 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-lib-modules\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.875407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-var-lib-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875366 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2a5c1d-a5d3-4341-8c92-2a050066670f-tmp-dir\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.875407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-etc-kubernetes\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.875407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-etc-selinux\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-registration-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875453 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-device-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875481 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thlml\" (UniqueName: \"kubernetes.io/projected/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-kube-api-access-thlml\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-tmp\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-device-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875527 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-systemd\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875545 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ffc7a641-db0a-4528-b993-5560e673b5d5-iptables-alerter-script\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t"
Apr 20 22:24:33.875575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875555 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovnkube-config\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovn-node-metrics-cert\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-log-socket\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875763 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-system-cni-dir\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875780 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-k8s-cni-cncf-io\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-daemon-config\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d33395f3-9f3c-4562-ad4d-b1058d8551bf-agent-certs\") pod \"konnectivity-agent-t89tm\" (UID: \"d33395f3-9f3c-4562-ad4d-b1058d8551bf\") " pod="kube-system/konnectivity-agent-t89tm"
Apr 20 22:24:33.875870 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875839 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-host\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875930 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-serviceca\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-registration-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-cni-bin\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.875973 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.875998 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbz7p\" (UniqueName: \"kubernetes.io/projected/9f3be14a-c6d2-4e17-88f8-9129b465bd71-kube-api-access-pbz7p\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.876084 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:34.376056267 +0000 UTC m=+2.049243941 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876098 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cni-binary-copy\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa2a5c1d-a5d3-4341-8c92-2a050066670f-hosts-file\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.876164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq"
Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-kubernetes\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk"
Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876223 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa2a5c1d-a5d3-4341-8c92-2a050066670f-hosts-file\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w"
Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-host\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876262 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-kubelet\") pod \"ovnkube-node-r2zfd\"
(UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876298 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-systemd-units\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876281 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-os-release\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnvj5\" (UniqueName: \"kubernetes.io/projected/ffc7a641-db0a-4528-b993-5560e673b5d5-kube-api-access-cnvj5\") pod \"iptables-alerter-shw5t\" (UID: 
\"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d33395f3-9f3c-4562-ad4d-b1058d8551bf-konnectivity-ca\") pod \"konnectivity-agent-t89tm\" (UID: \"d33395f3-9f3c-4562-ad4d-b1058d8551bf\") " pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-var-lib-kubelet\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876489 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-os-release\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876618 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-modprobe-d\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysconfig\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-slash\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.876739 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876704 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-node-log\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876730 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6h6d\" (UniqueName: \"kubernetes.io/projected/e3a6b05e-7ccd-4812-b0a3-5860098b7618-kube-api-access-j6h6d\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876753 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-cni-multus\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:24:33.876775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-kubelet\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-hostroot\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-tuned\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876862 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7hhs\" (UniqueName: \"kubernetes.io/projected/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-kube-api-access-v7hhs\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876921 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-cni-netd\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876927 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d33395f3-9f3c-4562-ad4d-b1058d8551bf-konnectivity-ca\") pod \"konnectivity-agent-t89tm\" (UID: \"d33395f3-9f3c-4562-ad4d-b1058d8551bf\") " pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.876966 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-socket-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877004 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877037 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-cni-bin\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-socket-dir\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877069 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-multus-certs\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysctl-conf\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dc6\" (UniqueName: \"kubernetes.io/projected/99b45f86-8fd1-4884-ab21-716f105f2d77-kube-api-access-z8dc6\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877211 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-sys\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877237 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-etc-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877264 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-run-ovn-kubernetes\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-socket-dir-parent\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877371 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2r4c\" (UniqueName: \"kubernetes.io/projected/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-kube-api-access-d2r4c\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: 
\"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877510 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b45f86-8fd1-4884-ab21-716f105f2d77-cni-binary-copy\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffc7a641-db0a-4528-b993-5560e673b5d5-host-slash\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-sys-fs\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ffc7a641-db0a-4528-b993-5560e673b5d5-host-slash\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877601 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-run\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-run-netns\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877633 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-sys-fs\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877646 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877672 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-ovn\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.877917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877713 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cnibin\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.878377 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-system-cni-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.878377 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.877797 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-cni-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.879638 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.879615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d33395f3-9f3c-4562-ad4d-b1058d8551bf-agent-certs\") pod \"konnectivity-agent-t89tm\" (UID: \"d33395f3-9f3c-4562-ad4d-b1058d8551bf\") " pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:33.885584 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:33.885466 
2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod674e8501a0d803973607282eda51f055.slice/crio-cca293240ff4d1f74fa303250f0f1bbb04d1daa912634368ef40bf92bfda6409 WatchSource:0}: Error finding container cca293240ff4d1f74fa303250f0f1bbb04d1daa912634368ef40bf92bfda6409: Status 404 returned error can't find the container with id cca293240ff4d1f74fa303250f0f1bbb04d1daa912634368ef40bf92bfda6409 Apr 20 22:24:33.885790 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:33.885769 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a116cbed3e163e10f394a1de03350e.slice/crio-f90a52de415e1a228c5aa1299ddb0171e0de176034e6e21a0c71218d47a3aa34 WatchSource:0}: Error finding container f90a52de415e1a228c5aa1299ddb0171e0de176034e6e21a0c71218d47a3aa34: Status 404 returned error can't find the container with id f90a52de415e1a228c5aa1299ddb0171e0de176034e6e21a0c71218d47a3aa34 Apr 20 22:24:33.886093 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.886071 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:33.886249 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.886230 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:33.886249 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:33.886251 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:33.886430 ip-10-0-133-201 kubenswrapper[2568]: E0420 
22:24:33.886340 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:34.386319698 +0000 UTC m=+2.059507365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:33.888734 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.888704 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnvj5\" (UniqueName: \"kubernetes.io/projected/ffc7a641-db0a-4528-b993-5560e673b5d5-kube-api-access-cnvj5\") pod \"iptables-alerter-shw5t\" (UID: \"ffc7a641-db0a-4528-b993-5560e673b5d5\") " pod="openshift-network-operator/iptables-alerter-shw5t" Apr 20 22:24:33.889073 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.889044 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp7kd\" (UniqueName: \"kubernetes.io/projected/fa2a5c1d-a5d3-4341-8c92-2a050066670f-kube-api-access-cp7kd\") pod \"node-resolver-6cs9w\" (UID: \"fa2a5c1d-a5d3-4341-8c92-2a050066670f\") " pod="openshift-dns/node-resolver-6cs9w" Apr 20 22:24:33.889203 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.889077 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrb7\" (UniqueName: \"kubernetes.io/projected/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-kube-api-access-kfrb7\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 
22:24:33.891631 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.891578 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2r4c\" (UniqueName: \"kubernetes.io/projected/9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a-kube-api-access-d2r4c\") pod \"aws-ebs-csi-driver-node-5vfbk\" (UID: \"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:33.894107 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.894088 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 22:24:33.911568 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.911518 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" event={"ID":"674e8501a0d803973607282eda51f055","Type":"ContainerStarted","Data":"cca293240ff4d1f74fa303250f0f1bbb04d1daa912634368ef40bf92bfda6409"} Apr 20 22:24:33.912432 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.912409 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" event={"ID":"f5a116cbed3e163e10f394a1de03350e","Type":"ContainerStarted","Data":"f90a52de415e1a228c5aa1299ddb0171e0de176034e6e21a0c71218d47a3aa34"} Apr 20 22:24:33.979117 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.979117 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/99b45f86-8fd1-4884-ab21-716f105f2d77-cni-binary-copy\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-run\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-run-netns\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-ovn\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cnibin\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979237 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-run\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-system-cni-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979279 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-cni-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-cnibin\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979260 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-run-netns\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979315 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-netns\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-netns\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979382 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-conf-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979398 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-cnibin\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysctl-d\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-ovn\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-systemd\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979464 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-cni-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979504 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cnibin\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979508 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-system-cni-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-conf-dir\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-env-overrides\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979550 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-systemd\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysctl-d\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovnkube-script-lib\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-lib-modules\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.979763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-var-lib-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979652 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-etc-kubernetes\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979697 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-var-lib-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979713 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thlml\" (UniqueName: \"kubernetes.io/projected/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-kube-api-access-thlml\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979750 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-tmp\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99b45f86-8fd1-4884-ab21-716f105f2d77-cni-binary-copy\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-systemd\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979794 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-lib-modules\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovnkube-config\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovn-node-metrics-cert\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979873 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-log-socket\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-system-cni-dir\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-k8s-cni-cncf-io\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979942 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-daemon-config\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-host\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.979998 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-serviceca\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-cni-bin\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.980424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980082 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-etc-kubernetes\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbz7p\" (UniqueName: \"kubernetes.io/projected/9f3be14a-c6d2-4e17-88f8-9129b465bd71-kube-api-access-pbz7p\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cni-binary-copy\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-kubernetes\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-log-socket\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980316 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-host\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-host\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980357 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-kubelet\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-system-cni-dir\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-cni-bin\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980242 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-host\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-kubelet\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980674 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-env-overrides\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-systemd-units\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cni-binary-copy\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovnkube-script-lib\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980748 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-systemd-units\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.981214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980757 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-k8s-cni-cncf-io\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980801 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-kubernetes\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-run-systemd\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-os-release\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-var-lib-kubelet\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-os-release\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-modprobe-d\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysconfig\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-var-lib-kubelet\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovnkube-config\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-slash\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980982 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-slash\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-node-log\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.980948 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3a6b05e-7ccd-4812-b0a3-5860098b7618-os-release\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981011 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysconfig\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981028 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-daemon-config\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6h6d\" (UniqueName: \"kubernetes.io/projected/e3a6b05e-7ccd-4812-b0a3-5860098b7618-kube-api-access-j6h6d\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-modprobe-d\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-node-log\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-cni-multus\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981090 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-kubelet\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-cni-multus\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-hostroot\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-tuned\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-os-release\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-hostroot\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7hhs\" (UniqueName: \"kubernetes.io/projected/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-kube-api-access-v7hhs\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981226 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-serviceca\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981240 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-cni-netd\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981282 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-cni-netd\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981315 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-cni-bin\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.982868 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-multus-certs\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysctl-conf\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-cni-bin\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-run-multus-certs\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981418 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dc6\" (UniqueName: \"kubernetes.io/projected/99b45f86-8fd1-4884-ab21-716f105f2d77-kube-api-access-z8dc6\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981442 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-sys\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-etc-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd"
Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-run-ovn-kubernetes\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981506 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-sysctl-conf\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-sys\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981554 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-etc-openvswitch\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981564 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f3be14a-c6d2-4e17-88f8-9129b465bd71-host-run-ovn-kubernetes\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981593 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-socket-dir-parent\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-multus-socket-dir-parent\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99b45f86-8fd1-4884-ab21-716f105f2d77-host-var-lib-kubelet\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.981947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e3a6b05e-7ccd-4812-b0a3-5860098b7618-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.983536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.982847 
2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-tmp\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.983986 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.982890 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f3be14a-c6d2-4e17-88f8-9129b465bd71-ovn-node-metrics-cert\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.983986 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.983396 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-etc-tuned\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:33.988007 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.987982 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thlml\" (UniqueName: \"kubernetes.io/projected/5af6d22e-6f75-493a-a5ab-9f2a0eafa36f-kube-api-access-thlml\") pod \"node-ca-j77rr\" (UID: \"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f\") " pod="openshift-image-registry/node-ca-j77rr" Apr 20 22:24:33.988398 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.988380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbz7p\" (UniqueName: \"kubernetes.io/projected/9f3be14a-c6d2-4e17-88f8-9129b465bd71-kube-api-access-pbz7p\") pod \"ovnkube-node-r2zfd\" (UID: \"9f3be14a-c6d2-4e17-88f8-9129b465bd71\") " pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:33.988535 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.988519 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6h6d\" (UniqueName: \"kubernetes.io/projected/e3a6b05e-7ccd-4812-b0a3-5860098b7618-kube-api-access-j6h6d\") pod \"multus-additional-cni-plugins-4npn9\" (UID: \"e3a6b05e-7ccd-4812-b0a3-5860098b7618\") " pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:33.988836 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.988822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dc6\" (UniqueName: \"kubernetes.io/projected/99b45f86-8fd1-4884-ab21-716f105f2d77-kube-api-access-z8dc6\") pod \"multus-snx6p\" (UID: \"99b45f86-8fd1-4884-ab21-716f105f2d77\") " pod="openshift-multus/multus-snx6p" Apr 20 22:24:33.989162 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:33.989131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7hhs\" (UniqueName: \"kubernetes.io/projected/bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4-kube-api-access-v7hhs\") pod \"tuned-n2cxv\" (UID: \"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4\") " pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:34.091825 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.091789 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:34.098377 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.098351 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-shw5t" Apr 20 22:24:34.098758 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.098722 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd33395f3_9f3c_4562_ad4d_b1058d8551bf.slice/crio-15833368c049c30bb8045ce3895d0863b71bf0a1714693f3bd81ba7c36f0c4e9 WatchSource:0}: Error finding container 15833368c049c30bb8045ce3895d0863b71bf0a1714693f3bd81ba7c36f0c4e9: Status 404 returned error can't find the container with id 15833368c049c30bb8045ce3895d0863b71bf0a1714693f3bd81ba7c36f0c4e9 Apr 20 22:24:34.104568 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.104529 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffc7a641_db0a_4528_b993_5560e673b5d5.slice/crio-44300ec115e0410e6f2b2d6c1ef479fb5904091107cd2c7d46d7c65a90341c7c WatchSource:0}: Error finding container 44300ec115e0410e6f2b2d6c1ef479fb5904091107cd2c7d46d7c65a90341c7c: Status 404 returned error can't find the container with id 44300ec115e0410e6f2b2d6c1ef479fb5904091107cd2c7d46d7c65a90341c7c Apr 20 22:24:34.114197 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.114128 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-6cs9w" Apr 20 22:24:34.120913 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.120880 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2a5c1d_a5d3_4341_8c92_2a050066670f.slice/crio-29b00977e09ed2564ce5b1a7535c74fcc337011bd46c6ce9b4d7182dfc15d75a WatchSource:0}: Error finding container 29b00977e09ed2564ce5b1a7535c74fcc337011bd46c6ce9b4d7182dfc15d75a: Status 404 returned error can't find the container with id 29b00977e09ed2564ce5b1a7535c74fcc337011bd46c6ce9b4d7182dfc15d75a Apr 20 22:24:34.127959 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.127936 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" Apr 20 22:24:34.135370 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.135344 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e30e4d9_9a90_421a_a38c_be8c9ff6fa9a.slice/crio-1edee7e1d1b91f0c4c8af2625ff5b9c74d21d1ed2bee0ff4e9561477995c7a91 WatchSource:0}: Error finding container 1edee7e1d1b91f0c4c8af2625ff5b9c74d21d1ed2bee0ff4e9561477995c7a91: Status 404 returned error can't find the container with id 1edee7e1d1b91f0c4c8af2625ff5b9c74d21d1ed2bee0ff4e9561477995c7a91 Apr 20 22:24:34.146987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.146962 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" Apr 20 22:24:34.153380 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.153352 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8b1d31_7a28_4d2a_bcd3_c9a620abbfc4.slice/crio-8c5671c61bb0e9cc1bdeca1a3972e508b966f1cb5e7a8428c12fabae505fa8c2 WatchSource:0}: Error finding container 8c5671c61bb0e9cc1bdeca1a3972e508b966f1cb5e7a8428c12fabae505fa8c2: Status 404 returned error can't find the container with id 8c5671c61bb0e9cc1bdeca1a3972e508b966f1cb5e7a8428c12fabae505fa8c2 Apr 20 22:24:34.162173 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.162137 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:34.168228 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.168206 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4npn9" Apr 20 22:24:34.170029 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.170003 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3be14a_c6d2_4e17_88f8_9129b465bd71.slice/crio-cfaece0d243ca1c01f3f681ca86b3152e30a71508bfd27eec56f316c0208f446 WatchSource:0}: Error finding container cfaece0d243ca1c01f3f681ca86b3152e30a71508bfd27eec56f316c0208f446: Status 404 returned error can't find the container with id cfaece0d243ca1c01f3f681ca86b3152e30a71508bfd27eec56f316c0208f446 Apr 20 22:24:34.175742 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.175712 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a6b05e_7ccd_4812_b0a3_5860098b7618.slice/crio-21a987bf45bf7c69b3038d3a250706106aeb9de6861be1fe0c4d1e6d1a17f906 WatchSource:0}: Error finding container 
21a987bf45bf7c69b3038d3a250706106aeb9de6861be1fe0c4d1e6d1a17f906: Status 404 returned error can't find the container with id 21a987bf45bf7c69b3038d3a250706106aeb9de6861be1fe0c4d1e6d1a17f906 Apr 20 22:24:34.185094 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.185066 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-snx6p" Apr 20 22:24:34.192316 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.192287 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b45f86_8fd1_4884_ab21_716f105f2d77.slice/crio-73d655877b21790493e22877deb5dae3f58dfad29d49a03445564fb267570e9a WatchSource:0}: Error finding container 73d655877b21790493e22877deb5dae3f58dfad29d49a03445564fb267570e9a: Status 404 returned error can't find the container with id 73d655877b21790493e22877deb5dae3f58dfad29d49a03445564fb267570e9a Apr 20 22:24:34.193303 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.193287 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-j77rr" Apr 20 22:24:34.199803 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:24:34.199781 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af6d22e_6f75_493a_a5ab_9f2a0eafa36f.slice/crio-31adf4ad3ade6b79592c9c46cae9ba96cc9ac33ddcc9f0fafb1928663fb8342d WatchSource:0}: Error finding container 31adf4ad3ade6b79592c9c46cae9ba96cc9ac33ddcc9f0fafb1928663fb8342d: Status 404 returned error can't find the container with id 31adf4ad3ade6b79592c9c46cae9ba96cc9ac33ddcc9f0fafb1928663fb8342d Apr 20 22:24:34.385609 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.384962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:34.385609 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.385145 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:34.385609 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.385235 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:35.385214551 +0000 UTC m=+3.058402221 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:34.486102 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.485935 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:34.486102 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.486091 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:34.486102 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.486109 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:34.486360 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.486121 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:34.486360 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.486197 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. 
No retries permitted until 2026-04-20 22:24:35.486176888 +0000 UTC m=+3.159364551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:34.812208 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.812146 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 22:19:33 +0000 UTC" deadline="2027-10-13 14:09:26.807750054 +0000 UTC" Apr 20 22:24:34.812208 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.812210 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12975h44m51.995549824s" Apr 20 22:24:34.915330 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.914478 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:34.915330 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:34.914652 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:34.923041 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.922961 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j77rr" event={"ID":"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f","Type":"ContainerStarted","Data":"31adf4ad3ade6b79592c9c46cae9ba96cc9ac33ddcc9f0fafb1928663fb8342d"} Apr 20 22:24:34.932013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.931965 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-snx6p" event={"ID":"99b45f86-8fd1-4884-ab21-716f105f2d77","Type":"ContainerStarted","Data":"73d655877b21790493e22877deb5dae3f58dfad29d49a03445564fb267570e9a"} Apr 20 22:24:34.948946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.948897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerStarted","Data":"21a987bf45bf7c69b3038d3a250706106aeb9de6861be1fe0c4d1e6d1a17f906"} Apr 20 22:24:34.961133 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.961088 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" event={"ID":"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4","Type":"ContainerStarted","Data":"8c5671c61bb0e9cc1bdeca1a3972e508b966f1cb5e7a8428c12fabae505fa8c2"} Apr 20 22:24:34.972119 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.972076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6cs9w" event={"ID":"fa2a5c1d-a5d3-4341-8c92-2a050066670f","Type":"ContainerStarted","Data":"29b00977e09ed2564ce5b1a7535c74fcc337011bd46c6ce9b4d7182dfc15d75a"} Apr 20 22:24:34.994692 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:34.994653 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t89tm" 
event={"ID":"d33395f3-9f3c-4562-ad4d-b1058d8551bf","Type":"ContainerStarted","Data":"15833368c049c30bb8045ce3895d0863b71bf0a1714693f3bd81ba7c36f0c4e9"} Apr 20 22:24:35.011850 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.011818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"cfaece0d243ca1c01f3f681ca86b3152e30a71508bfd27eec56f316c0208f446"} Apr 20 22:24:35.030344 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.030234 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" event={"ID":"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a","Type":"ContainerStarted","Data":"1edee7e1d1b91f0c4c8af2625ff5b9c74d21d1ed2bee0ff4e9561477995c7a91"} Apr 20 22:24:35.053124 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.053019 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-shw5t" event={"ID":"ffc7a641-db0a-4528-b993-5560e673b5d5","Type":"ContainerStarted","Data":"44300ec115e0410e6f2b2d6c1ef479fb5904091107cd2c7d46d7c65a90341c7c"} Apr 20 22:24:35.061566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.061348 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:35.132949 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.132674 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:35.183081 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.183038 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 22:24:35.393455 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.393274 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:35.393614 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.393581 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:35.393668 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.393654 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:37.393633524 +0000 UTC m=+5.066821203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:35.494295 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.494257 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:35.494488 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.494463 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:35.494488 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.494484 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:35.494603 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.494497 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:35.494603 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.494555 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:37.494535914 +0000 UTC m=+5.167723593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:35.813300 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.813252 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 22:19:33 +0000 UTC" deadline="2027-12-03 00:35:02.104170396 +0000 UTC" Apr 20 22:24:35.813300 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.813297 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14186h10m26.290877275s" Apr 20 22:24:35.909905 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:35.909265 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:35.909905 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:35.909410 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:36.911659 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:36.911616 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:36.912144 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:36.911747 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:37.411885 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:37.411707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:37.412100 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.411898 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:37.412100 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.411971 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:41.411951355 +0000 UTC m=+9.085139023 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:37.512405 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:37.512310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:37.512597 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.512464 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:37.512597 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.512493 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:37.512597 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.512507 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:37.512597 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.512570 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. 
No retries permitted until 2026-04-20 22:24:41.512553619 +0000 UTC m=+9.185741294 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:37.909308 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:37.909138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:37.909308 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:37.909305 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:38.909232 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:38.909193 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:38.909713 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:38.909329 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:39.909411 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:39.909369 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:39.909875 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:39.909554 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:40.908684 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:40.908647 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:40.908899 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:40.908788 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:41.449203 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:41.449024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:41.449203 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.449177 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:41.449717 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.449255 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:49.449235417 +0000 UTC m=+17.122423106 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:41.549974 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:41.549931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:41.550133 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.550118 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:41.550208 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.550141 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:41.550208 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.550169 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:41.550295 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.550231 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. 
No retries permitted until 2026-04-20 22:24:49.550211975 +0000 UTC m=+17.223399651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:41.909367 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:41.909274 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:41.909535 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:41.909415 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:42.909820 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:42.909730 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:42.910287 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:42.909883 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:43.909190 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:43.909142 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:43.909342 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:43.909299 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:44.909128 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:44.909088 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:44.909573 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:44.909242 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:45.909252 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:45.909214 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:45.909717 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:45.909351 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:46.909168 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:46.909107 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:46.909367 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:46.909256 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:47.909363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:47.909081 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:47.909824 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:47.909480 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:48.908862 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:48.908823 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:48.909051 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:48.908965 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:49.511264 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:49.511229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:49.511713 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.511386 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:49.511713 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.511461 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:05.511436963 +0000 UTC m=+33.184624638 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:49.612470 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:49.612418 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:49.612658 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.612617 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:49.612658 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.612644 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:49.612658 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.612654 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:49.612825 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.612725 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. 
No retries permitted until 2026-04-20 22:25:05.612705831 +0000 UTC m=+33.285893517 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:49.909041 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:49.908959 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:49.909224 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:49.909090 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:50.908700 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:50.908659 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:50.909167 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:50.908779 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:51.908693 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:51.908673 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:51.908793 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:51.908777 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:52.103117 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:52.103081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" event={"ID":"674e8501a0d803973607282eda51f055","Type":"ContainerStarted","Data":"0d69e100d3fffc9226bd1a8527b151508ceb804e011ad0e6d5ecb1785c4fcc36"} Apr 20 22:24:52.117228 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:52.117179 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-201.ec2.internal" podStartSLOduration=19.117144764 podStartE2EDuration="19.117144764s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:52.116740901 +0000 UTC m=+19.789928582" watchObservedRunningTime="2026-04-20 22:24:52.117144764 +0000 UTC m=+19.790332450" Apr 20 22:24:52.910055 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:52.909629 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:52.910816 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:52.910179 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:53.109833 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.108970 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-snx6p" event={"ID":"99b45f86-8fd1-4884-ab21-716f105f2d77","Type":"ContainerStarted","Data":"1c3ab622ad2b3e83c21e855765e10d33b0a955ce0bd84433c61a349bc3cf72c8"} Apr 20 22:24:53.114758 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.113912 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" event={"ID":"bf8b1d31-7a28-4d2a-bcd3-c9a620abbfc4","Type":"ContainerStarted","Data":"f66f1895ff2d77101c597c2c1e13a929c6c82135649abddc0b80100e6929751a"} Apr 20 22:24:53.128624 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.128404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"db2460da33d7d690be9d5d8a08987da5668f8ce6160616edbfa2860509faf0b6"} Apr 20 22:24:53.128773 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.128633 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"1398ab7e508df9c58aba48c652b6b3fa8134735600dede8f0c6f3f3248ed6a1a"} Apr 20 22:24:53.128773 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:24:53.128650 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"261f05057cfdc91ca98968c8a1d852c1b85dae5f1e39a77411dc86cf1b4ae1b3"} Apr 20 22:24:53.128773 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.128660 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"e4d669238f0f64461975be74e58fc5cd14f95808c78504995f4b15ed87b086f4"} Apr 20 22:24:53.128773 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.128668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"fbbb57c9faa4488438186fb4732def0aa286ebdc2f0270e0f816f27edc67b7d2"} Apr 20 22:24:53.133423 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.133366 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-snx6p" podStartSLOduration=2.266493384 podStartE2EDuration="20.13333136s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.194174678 +0000 UTC m=+1.867362346" lastFinishedPulling="2026-04-20 22:24:52.061012655 +0000 UTC m=+19.734200322" observedRunningTime="2026-04-20 22:24:53.1326314 +0000 UTC m=+20.805819098" watchObservedRunningTime="2026-04-20 22:24:53.13333136 +0000 UTC m=+20.806519045" Apr 20 22:24:53.163171 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.163098 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t89tm" podStartSLOduration=7.687466612 podStartE2EDuration="21.163082207s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.101475933 +0000 UTC m=+1.774663596" 
lastFinishedPulling="2026-04-20 22:24:47.577091525 +0000 UTC m=+15.250279191" observedRunningTime="2026-04-20 22:24:53.162749808 +0000 UTC m=+20.835937498" watchObservedRunningTime="2026-04-20 22:24:53.163082207 +0000 UTC m=+20.836269891" Apr 20 22:24:53.163341 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.163235 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-n2cxv" podStartSLOduration=2.442518706 podStartE2EDuration="20.163228336s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.154838184 +0000 UTC m=+1.828025850" lastFinishedPulling="2026-04-20 22:24:51.875547814 +0000 UTC m=+19.548735480" observedRunningTime="2026-04-20 22:24:53.147359566 +0000 UTC m=+20.820547252" watchObservedRunningTime="2026-04-20 22:24:53.163228336 +0000 UTC m=+20.836416022" Apr 20 22:24:53.908896 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.908869 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:53.909017 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:53.908976 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:53.940984 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:53.940953 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 22:24:54.134049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.133948 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6cs9w" event={"ID":"fa2a5c1d-a5d3-4341-8c92-2a050066670f","Type":"ContainerStarted","Data":"3bb7d202f4b7ad03bf411e8891a2c41e773e6324f100b8ebae7d6928848cb56b"} Apr 20 22:24:54.135261 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.135233 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t89tm" event={"ID":"d33395f3-9f3c-4562-ad4d-b1058d8551bf","Type":"ContainerStarted","Data":"8ad71c695393719b404857d5fcbdf5a39b1c62f673db3031de6bda02e9ef7a2c"} Apr 20 22:24:54.136464 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.136440 2568 generic.go:358] "Generic (PLEG): container finished" podID="f5a116cbed3e163e10f394a1de03350e" containerID="74aa109ab6014b9b6190bf20517ec16bb98aef027396a749030bcea3bfd05be6" exitCode=0 Apr 20 22:24:54.136585 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.136519 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" event={"ID":"f5a116cbed3e163e10f394a1de03350e","Type":"ContainerDied","Data":"74aa109ab6014b9b6190bf20517ec16bb98aef027396a749030bcea3bfd05be6"} Apr 20 22:24:54.139305 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.139280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"ef951c4fef43dce1da176dccb574ea1f77fe8ddfc6dbb475b0265c205f94deed"} Apr 20 
22:24:54.140880 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.140858 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" event={"ID":"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a","Type":"ContainerStarted","Data":"336ea889b546990f98c873a513793d8dbe5743685cecfef550b41b855d9cad85"} Apr 20 22:24:54.140986 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.140886 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" event={"ID":"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a","Type":"ContainerStarted","Data":"afc28cf83b93e1dd77d78c14afe664fd650b7ce5492693f0194049ba4e31bb96"} Apr 20 22:24:54.142196 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.142170 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-shw5t" event={"ID":"ffc7a641-db0a-4528-b993-5560e673b5d5","Type":"ContainerStarted","Data":"563104a1aa0dc557548ff9e8ce67e035136899a72da14a725d94fb7c21a1be77"} Apr 20 22:24:54.143412 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.143389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j77rr" event={"ID":"5af6d22e-6f75-493a-a5ab-9f2a0eafa36f","Type":"ContainerStarted","Data":"3afd1bb4530cfd9e533b9256f8ef39229bdd1cc8aa1ac9e7c9f687ee0b363c45"} Apr 20 22:24:54.144686 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.144660 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3a6b05e-7ccd-4812-b0a3-5860098b7618" containerID="574a2917bfa9aedbe633245877a23b2abce00886bf2a8f7990756b56d9894029" exitCode=0 Apr 20 22:24:54.144781 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.144736 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" 
event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerDied","Data":"574a2917bfa9aedbe633245877a23b2abce00886bf2a8f7990756b56d9894029"} Apr 20 22:24:54.162265 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.162212 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6cs9w" podStartSLOduration=4.412841376 podStartE2EDuration="22.162194145s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.122517489 +0000 UTC m=+1.795705155" lastFinishedPulling="2026-04-20 22:24:51.871870246 +0000 UTC m=+19.545057924" observedRunningTime="2026-04-20 22:24:54.149280235 +0000 UTC m=+21.822467921" watchObservedRunningTime="2026-04-20 22:24:54.162194145 +0000 UTC m=+21.835381856" Apr 20 22:24:54.179698 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.179636 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-shw5t" podStartSLOduration=4.427889286 podStartE2EDuration="22.179616892s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.106127262 +0000 UTC m=+1.779314925" lastFinishedPulling="2026-04-20 22:24:51.857854852 +0000 UTC m=+19.531042531" observedRunningTime="2026-04-20 22:24:54.162141466 +0000 UTC m=+21.835329152" watchObservedRunningTime="2026-04-20 22:24:54.179616892 +0000 UTC m=+21.852804577" Apr 20 22:24:54.208920 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.208852 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j77rr" podStartSLOduration=3.538518464 podStartE2EDuration="21.208831106s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.201327547 +0000 UTC m=+1.874515214" lastFinishedPulling="2026-04-20 22:24:51.871640178 +0000 UTC m=+19.544827856" observedRunningTime="2026-04-20 22:24:54.208033578 +0000 UTC m=+21.881221261" 
watchObservedRunningTime="2026-04-20 22:24:54.208831106 +0000 UTC m=+21.882018791" Apr 20 22:24:54.850607 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.850478 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T22:24:53.940975981Z","UUID":"90fd9e8e-3416-429a-8717-89af80c3ebc4","Handler":null,"Name":"","Endpoint":""} Apr 20 22:24:54.852362 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.852336 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 22:24:54.852362 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.852370 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 22:24:54.909255 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:54.909166 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:54.909421 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:54.909302 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:55.148720 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.148678 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" event={"ID":"f5a116cbed3e163e10f394a1de03350e","Type":"ContainerStarted","Data":"298b78c056ff5b99dbd52fafd0f1c7510f73ef812ac28c425ed48ceb48d1dbf8"} Apr 20 22:24:55.151088 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.151051 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" event={"ID":"9e30e4d9-9a90-421a-a38c-be8c9ff6fa9a","Type":"ContainerStarted","Data":"f35b02f4a21b0fbc1a924deab9597edf0f5fc1b716246918964240001bc02a82"} Apr 20 22:24:55.181470 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.181418 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-201.ec2.internal" podStartSLOduration=22.181398277 podStartE2EDuration="22.181398277s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:55.180680466 +0000 UTC m=+22.853868152" watchObservedRunningTime="2026-04-20 22:24:55.181398277 +0000 UTC m=+22.854585964" Apr 20 22:24:55.206759 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.206713 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5vfbk" podStartSLOduration=1.7004930649999999 podStartE2EDuration="22.206697159s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.137058695 +0000 UTC m=+1.810246362" lastFinishedPulling="2026-04-20 22:24:54.643262779 +0000 UTC m=+22.316450456" 
observedRunningTime="2026-04-20 22:24:55.206593435 +0000 UTC m=+22.879781124" watchObservedRunningTime="2026-04-20 22:24:55.206697159 +0000 UTC m=+22.879884844" Apr 20 22:24:55.381347 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.381304 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:55.381915 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.381888 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:55.908586 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:55.908550 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:55.908766 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:55.908676 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:56.159140 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:56.159044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"c6039233d962a22ab29fae3c49c1600abe6c1d514e597e5fea8755272de5e008"} Apr 20 22:24:56.159663 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:56.159645 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:56.159984 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:56.159966 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t89tm" Apr 20 22:24:56.909134 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:56.909084 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:56.909310 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:56.909239 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:57.908948 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:57.908916 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:57.909372 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:57.909059 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:24:58.908735 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:58.908558 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:24:58.908864 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:58.908840 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:24:59.168233 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.168112 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" event={"ID":"9f3be14a-c6d2-4e17-88f8-9129b465bd71","Type":"ContainerStarted","Data":"21eea22ce177cfc486645c67114d4b8739ee4bef53a61bac9f085181502bd951"} Apr 20 22:24:59.169172 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.168452 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:59.169172 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.168471 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:59.169888 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.169862 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3a6b05e-7ccd-4812-b0a3-5860098b7618" containerID="1773e1f9a00ca30e5acc7a6c1547e02ea94e2c526a2969864280f031b7d840b6" exitCode=0 Apr 20 22:24:59.169995 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.169901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerDied","Data":"1773e1f9a00ca30e5acc7a6c1547e02ea94e2c526a2969864280f031b7d840b6"} Apr 20 22:24:59.184252 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.184223 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:24:59.195137 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.195092 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" podStartSLOduration=7.8975928 podStartE2EDuration="26.195077661s" podCreationTimestamp="2026-04-20 
22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.171792319 +0000 UTC m=+1.844979981" lastFinishedPulling="2026-04-20 22:24:52.469277161 +0000 UTC m=+20.142464842" observedRunningTime="2026-04-20 22:24:59.19466308 +0000 UTC m=+26.867850788" watchObservedRunningTime="2026-04-20 22:24:59.195077661 +0000 UTC m=+26.868265343" Apr 20 22:24:59.908617 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:24:59.908434 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:24:59.908792 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:24:59.908703 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:25:00.173936 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.173815 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3a6b05e-7ccd-4812-b0a3-5860098b7618" containerID="cf27d1f6e2d87b05ede73153779f1a34fa619568d71524d488240ac1dc41bb5e" exitCode=0 Apr 20 22:25:00.173936 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.173897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerDied","Data":"cf27d1f6e2d87b05ede73153779f1a34fa619568d71524d488240ac1dc41bb5e"} Apr 20 22:25:00.174450 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.174428 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:25:00.190062 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.190033 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:25:00.306193 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.306138 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5kgfv"] Apr 20 22:25:00.306376 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.306295 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:00.306433 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:00.306420 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:25:00.306662 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.306638 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-shrcq"] Apr 20 22:25:00.306775 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:00.306762 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:00.306891 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:00.306869 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:25:01.179908 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:01.178558 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3a6b05e-7ccd-4812-b0a3-5860098b7618" containerID="3789dbcd5067d4251ff3f79ba5076466ba32792c772467c1db488e6c5633af87" exitCode=0 Apr 20 22:25:01.179908 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:01.179876 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerDied","Data":"3789dbcd5067d4251ff3f79ba5076466ba32792c772467c1db488e6c5633af87"} Apr 20 22:25:01.909403 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:01.909369 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:01.909403 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:01.909393 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:01.909650 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:01.909510 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:25:01.909650 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:01.909628 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:25:03.908909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:03.908877 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:03.909376 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:03.908878 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:03.909376 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:03.909026 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrcq" podUID="a761dc14-770d-43e4-b87c-68589f057961" Apr 20 22:25:03.909376 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:03.909097 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5kgfv" podUID="a6b93d87-66d5-4f06-b428-6cbc7fcdeda2" Apr 20 22:25:05.529420 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.529381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:05.529826 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:05.529564 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:25:05.529826 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:05.529648 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs podName:a6b93d87-66d5-4f06-b428-6cbc7fcdeda2 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:37.529626308 +0000 UTC m=+65.202813978 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs") pod "network-metrics-daemon-5kgfv" (UID: "a6b93d87-66d5-4f06-b428-6cbc7fcdeda2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:25:05.630403 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.630354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:05.630581 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:05.630551 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:25:05.630638 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:05.630581 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:25:05.630638 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:05.630594 2568 projected.go:194] Error preparing data for projected volume kube-api-access-cz2jv for pod openshift-network-diagnostics/network-check-target-shrcq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:25:05.630740 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:05.630662 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv podName:a761dc14-770d-43e4-b87c-68589f057961 nodeName:}" failed. 
No retries permitted until 2026-04-20 22:25:37.630645288 +0000 UTC m=+65.303832950 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cz2jv" (UniqueName: "kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv") pod "network-check-target-shrcq" (UID: "a761dc14-770d-43e4-b87c-68589f057961") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:25:05.658001 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.657972 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-201.ec2.internal" event="NodeReady" Apr 20 22:25:05.658173 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.658123 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 22:25:05.701341 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.701292 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5756b"] Apr 20 22:25:05.724164 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.724113 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-khz6t"] Apr 20 22:25:05.724366 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.724332 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:05.727239 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.727177 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 22:25:05.727400 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.727324 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 22:25:05.727400 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.727344 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv5w5\"" Apr 20 22:25:05.727736 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.727718 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 22:25:05.738356 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.738327 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5756b"] Apr 20 22:25:05.738356 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.738359 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-khz6t"] Apr 20 22:25:05.738553 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.738369 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-57zvt"] Apr 20 22:25:05.738553 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.738440 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.741380 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.741351 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rsxbh\"" Apr 20 22:25:05.741380 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.741376 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 22:25:05.741584 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.741351 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 22:25:05.741584 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.741548 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 22:25:05.741680 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.741669 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 22:25:05.758583 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.758543 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-57zvt"] Apr 20 22:25:05.758747 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.758729 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.761141 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.761113 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bl4mg\"" Apr 20 22:25:05.761303 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.761114 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 22:25:05.761303 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.761202 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 22:25:05.832648 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832563 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/09fe9e71-2821-4229-a355-e118b4e9f593-crio-socket\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.832648 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/09fe9e71-2821-4229-a355-e118b4e9f593-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.832876 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e78fef-7128-47b2-a77d-46a98bb24af9-cert\") pod \"ingress-canary-5756b\" (UID: \"c1e78fef-7128-47b2-a77d-46a98bb24af9\") " pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:05.832876 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l675v\" (UniqueName: \"kubernetes.io/projected/c1e78fef-7128-47b2-a77d-46a98bb24af9-kube-api-access-l675v\") pod \"ingress-canary-5756b\" (UID: \"c1e78fef-7128-47b2-a77d-46a98bb24af9\") " pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:05.832876 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832785 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/09fe9e71-2821-4229-a355-e118b4e9f593-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.832981 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rjm\" (UniqueName: \"kubernetes.io/projected/09fe9e71-2821-4229-a355-e118b4e9f593-kube-api-access-98rjm\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.832981 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/09fe9e71-2821-4229-a355-e118b4e9f593-data-volume\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.832981 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.832968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pk9w\" 
(UniqueName: \"kubernetes.io/projected/d07f11ad-2096-40a1-9534-a3146fa93510-kube-api-access-2pk9w\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.833143 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.833052 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d07f11ad-2096-40a1-9534-a3146fa93510-config-volume\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.833143 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.833105 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07f11ad-2096-40a1-9534-a3146fa93510-metrics-tls\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.833258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.833168 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07f11ad-2096-40a1-9534-a3146fa93510-tmp-dir\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.909304 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.909268 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:05.909513 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.909490 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:05.912895 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.912866 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-66fzs\"" Apr 20 22:25:05.912895 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.912885 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 22:25:05.913099 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.912871 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddtmq\"" Apr 20 22:25:05.913099 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.912877 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 22:25:05.913200 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.912866 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 22:25:05.933875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.933834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d07f11ad-2096-40a1-9534-a3146fa93510-config-volume\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.934074 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.933982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07f11ad-2096-40a1-9534-a3146fa93510-metrics-tls\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.934074 ip-10-0-133-201 
kubenswrapper[2568]: I0420 22:25:05.934027 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07f11ad-2096-40a1-9534-a3146fa93510-tmp-dir\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.934074 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/09fe9e71-2821-4229-a355-e118b4e9f593-crio-socket\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.934253 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/09fe9e71-2821-4229-a355-e118b4e9f593-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.934253 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e78fef-7128-47b2-a77d-46a98bb24af9-cert\") pod \"ingress-canary-5756b\" (UID: \"c1e78fef-7128-47b2-a77d-46a98bb24af9\") " pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:05.934253 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l675v\" (UniqueName: \"kubernetes.io/projected/c1e78fef-7128-47b2-a77d-46a98bb24af9-kube-api-access-l675v\") pod \"ingress-canary-5756b\" (UID: \"c1e78fef-7128-47b2-a77d-46a98bb24af9\") " pod="openshift-ingress-canary/ingress-canary-5756b" Apr 
20 22:25:05.934253 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/09fe9e71-2821-4229-a355-e118b4e9f593-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.934253 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934217 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98rjm\" (UniqueName: \"kubernetes.io/projected/09fe9e71-2821-4229-a355-e118b4e9f593-kube-api-access-98rjm\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.934484 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/09fe9e71-2821-4229-a355-e118b4e9f593-data-volume\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.934484 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pk9w\" (UniqueName: \"kubernetes.io/projected/d07f11ad-2096-40a1-9534-a3146fa93510-kube-api-access-2pk9w\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.935181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d07f11ad-2096-40a1-9534-a3146fa93510-config-volume\") pod 
\"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.935181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d07f11ad-2096-40a1-9534-a3146fa93510-tmp-dir\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.935181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.934896 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/09fe9e71-2821-4229-a355-e118b4e9f593-crio-socket\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.935181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.935104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/09fe9e71-2821-4229-a355-e118b4e9f593-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.935181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.935126 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/09fe9e71-2821-4229-a355-e118b4e9f593-data-volume\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.938963 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.938932 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d07f11ad-2096-40a1-9534-a3146fa93510-metrics-tls\") pod 
\"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.939112 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.938939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/09fe9e71-2821-4229-a355-e118b4e9f593-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.939112 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.939029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c1e78fef-7128-47b2-a77d-46a98bb24af9-cert\") pod \"ingress-canary-5756b\" (UID: \"c1e78fef-7128-47b2-a77d-46a98bb24af9\") " pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:05.943420 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.943391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pk9w\" (UniqueName: \"kubernetes.io/projected/d07f11ad-2096-40a1-9534-a3146fa93510-kube-api-access-2pk9w\") pod \"dns-default-57zvt\" (UID: \"d07f11ad-2096-40a1-9534-a3146fa93510\") " pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:05.943420 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.943410 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rjm\" (UniqueName: \"kubernetes.io/projected/09fe9e71-2821-4229-a355-e118b4e9f593-kube-api-access-98rjm\") pod \"insights-runtime-extractor-khz6t\" (UID: \"09fe9e71-2821-4229-a355-e118b4e9f593\") " pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:05.943622 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:05.943490 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l675v\" (UniqueName: 
\"kubernetes.io/projected/c1e78fef-7128-47b2-a77d-46a98bb24af9-kube-api-access-l675v\") pod \"ingress-canary-5756b\" (UID: \"c1e78fef-7128-47b2-a77d-46a98bb24af9\") " pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:06.036775 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.036743 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5756b" Apr 20 22:25:06.050603 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.050562 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-khz6t" Apr 20 22:25:06.069307 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.069275 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:06.424634 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.424590 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw"] Apr 20 22:25:06.445480 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.445446 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw"] Apr 20 22:25:06.445664 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.445619 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.449571 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.449545 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-tfnhf\"" Apr 20 22:25:06.449831 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.449815 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 22:25:06.450549 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.450523 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 22:25:06.451513 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.451459 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 22:25:06.451513 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.451486 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 22:25:06.451663 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.451459 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 22:25:06.452038 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.452018 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f2prt"] Apr 20 22:25:06.469511 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.469420 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f2prt"] Apr 20 22:25:06.469686 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.469591 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.473566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.473530 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 22:25:06.473727 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.473539 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-5hcbj\"" Apr 20 22:25:06.473727 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.473540 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 22:25:06.473838 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.473542 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 22:25:06.483659 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.483621 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6jbt7"] Apr 20 22:25:06.499763 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.499737 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.502478 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.502449 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 22:25:06.502478 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.502468 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5mrl8\"" Apr 20 22:25:06.502920 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.502793 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 22:25:06.502920 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.502868 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 22:25:06.539472 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bae61085-f01a-4979-8495-49df502b51b9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.539472 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:25:06.539570 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bae61085-f01a-4979-8495-49df502b51b9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539696 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ac8a623-4817-4b58-9c3f-57dce933db29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539788 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ac8a623-4817-4b58-9c3f-57dce933db29-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539846 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ac8a623-4817-4b58-9c3f-57dce933db29-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7r6f\" (UniqueName: \"kubernetes.io/projected/bae61085-f01a-4979-8495-49df502b51b9-kube-api-access-g7r6f\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.540006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.539899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d8r2\" (UniqueName: \"kubernetes.io/projected/5ac8a623-4817-4b58-9c3f-57dce933db29-kube-api-access-9d8r2\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.640389 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640342 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-textfile\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.640590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ac8a623-4817-4b58-9c3f-57dce933db29-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.640590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7r6f\" (UniqueName: \"kubernetes.io/projected/bae61085-f01a-4979-8495-49df502b51b9-kube-api-access-g7r6f\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.640590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640485 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-accelerators-collector-config\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.640590 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f2prt\" 
(UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.640800 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640669 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-root\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.640800 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640711 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-wtmp\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.640800 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-tls\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.640800 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640772 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84a62217-01ac-4867-83c4-e5586c70021c-metrics-client-ca\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640808 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/bae61085-f01a-4979-8495-49df502b51b9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ac8a623-4817-4b58-9c3f-57dce933db29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-sys\") pod \"node-exporter-6jbt7\" (UID: 
\"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ac8a623-4817-4b58-9c3f-57dce933db29-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.641020 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.640991 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9d8r2\" (UniqueName: \"kubernetes.io/projected/5ac8a623-4817-4b58-9c3f-57dce933db29-kube-api-access-9d8r2\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.641367 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.641024 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hsl\" (UniqueName: \"kubernetes.io/projected/84a62217-01ac-4867-83c4-e5586c70021c-kube-api-access-m2hsl\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.641367 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.641054 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bae61085-f01a-4979-8495-49df502b51b9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.641367 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:25:06.641081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.641367 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.641220 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bae61085-f01a-4979-8495-49df502b51b9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.641367 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:06.641281 2568 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 20 22:25:06.641367 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.641301 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ac8a623-4817-4b58-9c3f-57dce933db29-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.641650 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:06.641544 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-tls podName:bae61085-f01a-4979-8495-49df502b51b9 nodeName:}" failed. No retries permitted until 2026-04-20 22:25:07.141523725 +0000 UTC m=+34.814711391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-f2prt" (UID: "bae61085-f01a-4979-8495-49df502b51b9") : secret "kube-state-metrics-tls" not found Apr 20 22:25:06.641900 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.641875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.643544 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.643519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.644168 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.644134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ac8a623-4817-4b58-9c3f-57dce933db29-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.644257 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.644197 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5ac8a623-4817-4b58-9c3f-57dce933db29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.652897 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.652863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7r6f\" (UniqueName: \"kubernetes.io/projected/bae61085-f01a-4979-8495-49df502b51b9-kube-api-access-g7r6f\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.654063 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.654035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d8r2\" (UniqueName: \"kubernetes.io/projected/5ac8a623-4817-4b58-9c3f-57dce933db29-kube-api-access-9d8r2\") pod \"openshift-state-metrics-9d44df66c-8kxkw\" (UID: \"5ac8a623-4817-4b58-9c3f-57dce933db29\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.667795 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.667748 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bae61085-f01a-4979-8495-49df502b51b9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:06.741518 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-sys\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.741518 ip-10-0-133-201 
kubenswrapper[2568]: I0420 22:25:06.741528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hsl\" (UniqueName: \"kubernetes.io/projected/84a62217-01ac-4867-83c4-e5586c70021c-kube-api-access-m2hsl\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.741766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.741766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-sys\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.741766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-textfile\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.741766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-accelerators-collector-config\") pod \"node-exporter-6jbt7\" (UID: 
\"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.741766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-root\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.742032 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741769 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-wtmp\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.742032 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-root\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.742032 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-tls\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.742032 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741936 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84a62217-01ac-4867-83c4-e5586c70021c-metrics-client-ca\") pod \"node-exporter-6jbt7\" (UID: 
\"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.742032 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.741952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-textfile\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.742032 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.742019 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-wtmp\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.744407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.744377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.744407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.744395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-tls\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.750267 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.750232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/84a62217-01ac-4867-83c4-e5586c70021c-node-exporter-accelerators-collector-config\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.750438 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.750297 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84a62217-01ac-4867-83c4-e5586c70021c-metrics-client-ca\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.758258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.758224 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" Apr 20 22:25:06.758587 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.758554 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hsl\" (UniqueName: \"kubernetes.io/projected/84a62217-01ac-4867-83c4-e5586c70021c-kube-api-access-m2hsl\") pod \"node-exporter-6jbt7\" (UID: \"84a62217-01ac-4867-83c4-e5586c70021c\") " pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:06.810246 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:06.810202 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6jbt7" Apr 20 22:25:07.007228 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:07.007188 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a62217_01ac_4867_83c4_e5586c70021c.slice/crio-5fdfbe8ab18f17b0e283afe90a3caa843e4d3a4a2344100492148a157163cf0f WatchSource:0}: Error finding container 5fdfbe8ab18f17b0e283afe90a3caa843e4d3a4a2344100492148a157163cf0f: Status 404 returned error can't find the container with id 5fdfbe8ab18f17b0e283afe90a3caa843e4d3a4a2344100492148a157163cf0f Apr 20 22:25:07.145496 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.145452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:07.154594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.154563 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bae61085-f01a-4979-8495-49df502b51b9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f2prt\" (UID: \"bae61085-f01a-4979-8495-49df502b51b9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:07.192522 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.192473 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6jbt7" event={"ID":"84a62217-01ac-4867-83c4-e5586c70021c","Type":"ContainerStarted","Data":"5fdfbe8ab18f17b0e283afe90a3caa843e4d3a4a2344100492148a157163cf0f"} Apr 20 22:25:07.198528 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.198477 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-5756b"] Apr 20 22:25:07.208043 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.207994 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-khz6t"] Apr 20 22:25:07.211025 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.210987 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw"] Apr 20 22:25:07.223355 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.223326 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-57zvt"] Apr 20 22:25:07.255041 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:07.254962 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1e78fef_7128_47b2_a77d_46a98bb24af9.slice/crio-5d4d492a2e8ba8885ff71e18a5054e67192791441f8337222b958e2201f1ca41 WatchSource:0}: Error finding container 5d4d492a2e8ba8885ff71e18a5054e67192791441f8337222b958e2201f1ca41: Status 404 returned error can't find the container with id 5d4d492a2e8ba8885ff71e18a5054e67192791441f8337222b958e2201f1ca41 Apr 20 22:25:07.257463 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:07.257436 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fe9e71_2821_4229_a355_e118b4e9f593.slice/crio-0ee2634e57525881a203bbbd523675fda4fba5cfb6c50c140af4a6fee91b276a WatchSource:0}: Error finding container 0ee2634e57525881a203bbbd523675fda4fba5cfb6c50c140af4a6fee91b276a: Status 404 returned error can't find the container with id 0ee2634e57525881a203bbbd523675fda4fba5cfb6c50c140af4a6fee91b276a Apr 20 22:25:07.258942 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:07.258921 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac8a623_4817_4b58_9c3f_57dce933db29.slice/crio-64f5adc60c372e28e2ed795b7e3e7beae3d9827166304a590c314772eeaa31b2 WatchSource:0}: Error finding container 64f5adc60c372e28e2ed795b7e3e7beae3d9827166304a590c314772eeaa31b2: Status 404 returned error can't find the container with id 64f5adc60c372e28e2ed795b7e3e7beae3d9827166304a590c314772eeaa31b2 Apr 20 22:25:07.259327 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:07.259306 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07f11ad_2096_40a1_9534_a3146fa93510.slice/crio-e4cdacd6c0c658c5c16f45c415ee72bb4103793da48de11c572c5d297d8876d9 WatchSource:0}: Error finding container e4cdacd6c0c658c5c16f45c415ee72bb4103793da48de11c572c5d297d8876d9: Status 404 returned error can't find the container with id e4cdacd6c0c658c5c16f45c415ee72bb4103793da48de11c572c5d297d8876d9 Apr 20 22:25:07.381908 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.381653 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" Apr 20 22:25:07.538428 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:07.538384 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f2prt"] Apr 20 22:25:07.542641 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:07.542603 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae61085_f01a_4979_8495_49df502b51b9.slice/crio-98d15540ba87dbb8ca101f920fe53ba008801add2a1b3c3ab938654ef906d8e7 WatchSource:0}: Error finding container 98d15540ba87dbb8ca101f920fe53ba008801add2a1b3c3ab938654ef906d8e7: Status 404 returned error can't find the container with id 98d15540ba87dbb8ca101f920fe53ba008801add2a1b3c3ab938654ef906d8e7 Apr 20 22:25:08.196248 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.196214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-57zvt" event={"ID":"d07f11ad-2096-40a1-9534-a3146fa93510","Type":"ContainerStarted","Data":"e4cdacd6c0c658c5c16f45c415ee72bb4103793da48de11c572c5d297d8876d9"} Apr 20 22:25:08.198348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.198311 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" event={"ID":"5ac8a623-4817-4b58-9c3f-57dce933db29","Type":"ContainerStarted","Data":"c6dbfd552836b04b05c6a3e57420cf8e8e277a5691d413fe49c1b009cc56af32"} Apr 20 22:25:08.198502 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.198356 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" event={"ID":"5ac8a623-4817-4b58-9c3f-57dce933db29","Type":"ContainerStarted","Data":"f7dc6a552b5a127dd07e9df64a5a4a00edbd1a44065d395e3fec4e2b94f54df7"} Apr 20 22:25:08.198502 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.198369 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" event={"ID":"5ac8a623-4817-4b58-9c3f-57dce933db29","Type":"ContainerStarted","Data":"64f5adc60c372e28e2ed795b7e3e7beae3d9827166304a590c314772eeaa31b2"} Apr 20 22:25:08.199691 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.199662 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5756b" event={"ID":"c1e78fef-7128-47b2-a77d-46a98bb24af9","Type":"ContainerStarted","Data":"5d4d492a2e8ba8885ff71e18a5054e67192791441f8337222b958e2201f1ca41"} Apr 20 22:25:08.201222 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.201178 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khz6t" event={"ID":"09fe9e71-2821-4229-a355-e118b4e9f593","Type":"ContainerStarted","Data":"531a58c9ea64c6cd9483e87e8ddcd146c1c22b2a377f27b2001a7f89ceb8b81e"} Apr 20 22:25:08.201222 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.201214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khz6t" event={"ID":"09fe9e71-2821-4229-a355-e118b4e9f593","Type":"ContainerStarted","Data":"0ee2634e57525881a203bbbd523675fda4fba5cfb6c50c140af4a6fee91b276a"} Apr 20 22:25:08.203879 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.203851 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3a6b05e-7ccd-4812-b0a3-5860098b7618" containerID="56048344d51b6c4eab5adf9e795557c573b2c9aa77fcd880e0555bb0c85d709e" exitCode=0 Apr 20 22:25:08.204002 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.203945 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerDied","Data":"56048344d51b6c4eab5adf9e795557c573b2c9aa77fcd880e0555bb0c85d709e"} Apr 20 22:25:08.205750 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:08.205209 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" event={"ID":"bae61085-f01a-4979-8495-49df502b51b9","Type":"ContainerStarted","Data":"98d15540ba87dbb8ca101f920fe53ba008801add2a1b3c3ab938654ef906d8e7"} Apr 20 22:25:09.411826 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.411708 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-65c5db58f9-mftmf"] Apr 20 22:25:09.438853 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.438805 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65c5db58f9-mftmf"] Apr 20 22:25:09.439070 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.439048 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.441993 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.441961 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 22:25:09.442180 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.442026 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 22:25:09.442180 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.441961 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 22:25:09.442180 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.442102 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-flkrhhv11kn41\"" Apr 20 22:25:09.442351 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.442266 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 
22:25:09.442399 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.442364 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jg2rp\"" Apr 20 22:25:09.442447 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.442437 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 22:25:09.565908 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.565686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66fwq\" (UniqueName: \"kubernetes.io/projected/05ee4b26-952f-4609-bd5e-75d703d80bf3-kube-api-access-66fwq\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566089 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.565934 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566089 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.565999 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566089 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.566029 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05ee4b26-952f-4609-bd5e-75d703d80bf3-metrics-client-ca\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566089 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.566070 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566279 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.566178 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566279 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.566228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-grpc-tls\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.566279 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.566262 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-tls\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.666756 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.666756 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.666756 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05ee4b26-952f-4609-bd5e-75d703d80bf3-metrics-client-ca\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.667013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.667013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.667013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-grpc-tls\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.667013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-tls\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.667013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.666962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66fwq\" (UniqueName: \"kubernetes.io/projected/05ee4b26-952f-4609-bd5e-75d703d80bf3-kube-api-access-66fwq\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.668715 ip-10-0-133-201 kubenswrapper[2568]: 
I0420 22:25:09.668467 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05ee4b26-952f-4609-bd5e-75d703d80bf3-metrics-client-ca\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.672922 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.672887 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.673080 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.672886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.673080 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.672961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-grpc-tls\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.673334 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.673309 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.673408 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.673388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-tls\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.673742 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.673702 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05ee4b26-952f-4609-bd5e-75d703d80bf3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.675538 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.675515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66fwq\" (UniqueName: \"kubernetes.io/projected/05ee4b26-952f-4609-bd5e-75d703d80bf3-kube-api-access-66fwq\") pod \"thanos-querier-65c5db58f9-mftmf\" (UID: \"05ee4b26-952f-4609-bd5e-75d703d80bf3\") " pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:09.750679 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:09.750636 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:10.730301 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.730271 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74b9ccffd-2bt72"] Apr 20 22:25:10.761179 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.760495 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b9ccffd-2bt72"] Apr 20 22:25:10.761179 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.760631 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.765983 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.765822 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 22:25:10.765983 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.765855 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 22:25:10.767630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.767109 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 22:25:10.767630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.767133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 22:25:10.767630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.767123 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 22:25:10.767630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.767120 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 22:25:10.767630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.767518 2568 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-dn9dg\"" Apr 20 22:25:10.767630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.767623 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 22:25:10.774629 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.774589 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 22:25:10.875182 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.874966 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5c868c9cbc-68w4d"] Apr 20 22:25:10.877407 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.877332 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-oauth-config\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.878429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.877992 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-service-ca\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.878429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.878053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-oauth-serving-cert\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " 
pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.878429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.878106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-serving-cert\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.878429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.878160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-config\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.878429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.878230 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-trusted-ca-bundle\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.878429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.878271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbc7f\" (UniqueName: \"kubernetes.io/projected/7f78bc43-5ed8-446b-83d7-02dc39db2008-kube-api-access-rbc7f\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.895424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.895372 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-65c5db58f9-mftmf"] Apr 20 22:25:10.895424 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.895408 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5c868c9cbc-68w4d"] Apr 20 22:25:10.895559 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.895546 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.899899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.899264 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 22:25:10.899899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.899329 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 22:25:10.899899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.899496 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 22:25:10.899899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.899599 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4erc5cj8sq2v8\"" Apr 20 22:25:10.899899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.899742 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 22:25:10.899899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.899775 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-bj5fd\"" Apr 20 22:25:10.978992 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.978892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-audit-log\") pod 
\"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.978992 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.978956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbc7f\" (UniqueName: \"kubernetes.io/projected/7f78bc43-5ed8-446b-83d7-02dc39db2008-kube-api-access-rbc7f\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.978992 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.978988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-oauth-config\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979024 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-service-ca\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-oauth-serving-cert\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979089 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-client-ca-bundle\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979138 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-secret-metrics-server-tls\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-serving-cert\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-secret-metrics-server-client-certs\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979248 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-metrics-server-audit-profiles\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: 
\"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979282 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-config\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp7wg\" (UniqueName: \"kubernetes.io/projected/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-kube-api-access-mp7wg\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:10.979514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.979406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-trusted-ca-bundle\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.981310 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.980600 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-trusted-ca-bundle\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.981310 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.981221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-oauth-serving-cert\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.981310 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.981264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-config\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.981793 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.981532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-service-ca\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.983077 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.983031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-oauth-config\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.983690 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.983668 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-serving-cert\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:10.989515 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:10.989483 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbc7f\" (UniqueName: \"kubernetes.io/projected/7f78bc43-5ed8-446b-83d7-02dc39db2008-kube-api-access-rbc7f\") pod \"console-74b9ccffd-2bt72\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:11.076321 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.076131 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:11.080461 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-client-ca-bundle\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.080594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080485 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-secret-metrics-server-tls\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.080594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080525 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-secret-metrics-server-client-certs\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.080594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080549 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-metrics-server-audit-profiles\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.080594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mp7wg\" (UniqueName: \"kubernetes.io/projected/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-kube-api-access-mp7wg\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.080834 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.080834 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.080679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-audit-log\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " 
pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.081096 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.081071 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-audit-log\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.081953 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.081899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-metrics-server-audit-profiles\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.082279 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.082251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.083339 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.083312 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-secret-metrics-server-tls\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.083416 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.083373 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-secret-metrics-server-client-certs\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.083537 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.083511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-client-ca-bundle\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.088606 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.088579 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp7wg\" (UniqueName: \"kubernetes.io/projected/c2b4944e-42df-4cc7-a7fa-55aff7e04fbf-kube-api-access-mp7wg\") pod \"metrics-server-5c868c9cbc-68w4d\" (UID: \"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf\") " pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.127144 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:11.127105 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ee4b26_952f_4609_bd5e_75d703d80bf3.slice/crio-89027852569b45ff8a87a6e10afce425df8bb0ee5cc1d81b5dafda67bc95824b WatchSource:0}: Error finding container 89027852569b45ff8a87a6e10afce425df8bb0ee5cc1d81b5dafda67bc95824b: Status 404 returned error can't find the container with id 89027852569b45ff8a87a6e10afce425df8bb0ee5cc1d81b5dafda67bc95824b Apr 20 22:25:11.229765 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.225423 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7"] Apr 20 22:25:11.234000 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.232433 2568 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:11.242823 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.242704 2568 generic.go:358] "Generic (PLEG): container finished" podID="e3a6b05e-7ccd-4812-b0a3-5860098b7618" containerID="74b46411a71226831f054b2f905b9c05d058d1ba48b17a4753ac4c716811fd1a" exitCode=0 Apr 20 22:25:11.245332 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.245265 2568 generic.go:358] "Generic (PLEG): container finished" podID="84a62217-01ac-4867-83c4-e5586c70021c" containerID="6953527c9941bfe1febbe8b7708e32a1c971a066c1d8ea1cf5daaefd64bbd695" exitCode=0 Apr 20 22:25:11.260389 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.260352 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7"] Apr 20 22:25:11.260517 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.260394 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerDied","Data":"74b46411a71226831f054b2f905b9c05d058d1ba48b17a4753ac4c716811fd1a"} Apr 20 22:25:11.260517 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.260422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6jbt7" event={"ID":"84a62217-01ac-4867-83c4-e5586c70021c","Type":"ContainerDied","Data":"6953527c9941bfe1febbe8b7708e32a1c971a066c1d8ea1cf5daaefd64bbd695"} Apr 20 22:25:11.260517 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.260438 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" event={"ID":"bae61085-f01a-4979-8495-49df502b51b9","Type":"ContainerStarted","Data":"4df3f8b00e74e7b4a107afadc9d24c725d4f7ca0add5bd9b3a3854f12bb2271d"} Apr 20 22:25:11.260517 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.260453 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"89027852569b45ff8a87a6e10afce425df8bb0ee5cc1d81b5dafda67bc95824b"} Apr 20 22:25:11.260829 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.260572 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:11.262319 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.262260 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-57zvt" event={"ID":"d07f11ad-2096-40a1-9534-a3146fa93510","Type":"ContainerStarted","Data":"c326acd566499d1c27205d75942ff5b357e9c96ada0c0982efb6900693061e27"} Apr 20 22:25:11.265042 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.263968 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-w5hkp\"" Apr 20 22:25:11.265042 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.264285 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 22:25:11.271933 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.270354 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5756b" event={"ID":"c1e78fef-7128-47b2-a77d-46a98bb24af9","Type":"ContainerStarted","Data":"5944d52a9c3195a4ae1b1ca7dd0e3529e2c4af86d3070b0ea1064593b338e5ec"} Apr 20 22:25:11.314709 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.308341 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5756b" podStartSLOduration=2.835212266 podStartE2EDuration="6.308320445s" podCreationTimestamp="2026-04-20 22:25:05 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.25698924 +0000 UTC m=+34.930176903" 
lastFinishedPulling="2026-04-20 22:25:10.730097405 +0000 UTC m=+38.403285082" observedRunningTime="2026-04-20 22:25:11.307810386 +0000 UTC m=+38.980998160" watchObservedRunningTime="2026-04-20 22:25:11.308320445 +0000 UTC m=+38.981508130" Apr 20 22:25:11.322623 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.322462 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74b9ccffd-2bt72"] Apr 20 22:25:11.327310 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:11.327204 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f78bc43_5ed8_446b_83d7_02dc39db2008.slice/crio-ccdd8d7d12a4b6dcdba8653eaa2058d737aa8d417e071b8d579a523b67916086 WatchSource:0}: Error finding container ccdd8d7d12a4b6dcdba8653eaa2058d737aa8d417e071b8d579a523b67916086: Status 404 returned error can't find the container with id ccdd8d7d12a4b6dcdba8653eaa2058d737aa8d417e071b8d579a523b67916086 Apr 20 22:25:11.394211 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.391605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e9da1b1c-c4fb-4597-9667-f377d36939d7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-22wf7\" (UID: \"e9da1b1c-c4fb-4597-9667-f377d36939d7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:11.492955 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.492764 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5c868c9cbc-68w4d"] Apr 20 22:25:11.493229 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.493041 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e9da1b1c-c4fb-4597-9667-f377d36939d7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-22wf7\" (UID: 
\"e9da1b1c-c4fb-4597-9667-f377d36939d7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:11.497644 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.497608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/e9da1b1c-c4fb-4597-9667-f377d36939d7-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-22wf7\" (UID: \"e9da1b1c-c4fb-4597-9667-f377d36939d7\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:11.498628 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:11.498597 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b4944e_42df_4cc7_a7fa_55aff7e04fbf.slice/crio-770756344c7bdbd5a5114c98bf3e1948222c91ac63914336083403db2a63611c WatchSource:0}: Error finding container 770756344c7bdbd5a5114c98bf3e1948222c91ac63914336083403db2a63611c: Status 404 returned error can't find the container with id 770756344c7bdbd5a5114c98bf3e1948222c91ac63914336083403db2a63611c Apr 20 22:25:11.620931 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.620899 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:11.747414 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:11.747381 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7"] Apr 20 22:25:11.756624 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:11.756582 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9da1b1c_c4fb_4597_9667_f377d36939d7.slice/crio-f7a9c68e0206fca30541d4787e41dc2d2cf3bf1e50941eb5d5a612ccaaff9c0a WatchSource:0}: Error finding container f7a9c68e0206fca30541d4787e41dc2d2cf3bf1e50941eb5d5a612ccaaff9c0a: Status 404 returned error can't find the container with id f7a9c68e0206fca30541d4787e41dc2d2cf3bf1e50941eb5d5a612ccaaff9c0a Apr 20 22:25:12.299701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.299090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-57zvt" event={"ID":"d07f11ad-2096-40a1-9534-a3146fa93510","Type":"ContainerStarted","Data":"156ba88f7cfc6c9dace549533aef786a9539ff33b2f2cd9fd1a1bb6c4a82105b"} Apr 20 22:25:12.299701 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.299417 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:12.305120 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.304397 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" event={"ID":"5ac8a623-4817-4b58-9c3f-57dce933db29","Type":"ContainerStarted","Data":"a1ec0335e858f580bf5eb990a7cf0a2d6524ec4997ef80a37903691c29c88c4e"} Apr 20 22:25:12.311137 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.311094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khz6t" 
event={"ID":"09fe9e71-2821-4229-a355-e118b4e9f593","Type":"ContainerStarted","Data":"e8d71dd900cec475348cf786e6aa353a3b2f45d0e1b981edcecc3f742261be1a"} Apr 20 22:25:12.316135 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.316086 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4npn9" event={"ID":"e3a6b05e-7ccd-4812-b0a3-5860098b7618","Type":"ContainerStarted","Data":"2752cce29ef9d63ca15755200630a30b1699f0e237c7e5bec823b9e74582bee3"} Apr 20 22:25:12.317418 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.317367 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-57zvt" podStartSLOduration=3.860829308 podStartE2EDuration="7.317350443s" podCreationTimestamp="2026-04-20 22:25:05 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.27325196 +0000 UTC m=+34.946439633" lastFinishedPulling="2026-04-20 22:25:10.729773092 +0000 UTC m=+38.402960768" observedRunningTime="2026-04-20 22:25:12.316853288 +0000 UTC m=+39.990040973" watchObservedRunningTime="2026-04-20 22:25:12.317350443 +0000 UTC m=+39.990538128" Apr 20 22:25:12.320202 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.320128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6jbt7" event={"ID":"84a62217-01ac-4867-83c4-e5586c70021c","Type":"ContainerStarted","Data":"1720d898fdc18843175167c4b3d2852470bfc20f55c19040111b2cb3b036ed18"} Apr 20 22:25:12.320202 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.320178 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6jbt7" event={"ID":"84a62217-01ac-4867-83c4-e5586c70021c","Type":"ContainerStarted","Data":"a3349be54339a78523fb748a6949a92c5e5fe36d3d59181ddcda34d4219eec0b"} Apr 20 22:25:12.322702 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.322671 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b9ccffd-2bt72" 
event={"ID":"7f78bc43-5ed8-446b-83d7-02dc39db2008","Type":"ContainerStarted","Data":"ccdd8d7d12a4b6dcdba8653eaa2058d737aa8d417e071b8d579a523b67916086"} Apr 20 22:25:12.327112 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.327058 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" event={"ID":"bae61085-f01a-4979-8495-49df502b51b9","Type":"ContainerStarted","Data":"9c3c468a650606e11af6c847a9df02d0523bcb145d58dde8fa5106ff0e2b3069"} Apr 20 22:25:12.327112 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.327095 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" event={"ID":"bae61085-f01a-4979-8495-49df502b51b9","Type":"ContainerStarted","Data":"94f0fc999bf490f2c1cbc1239c88499de0a7a35d7ce45cb4773da73054997766"} Apr 20 22:25:12.329800 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.329761 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" event={"ID":"e9da1b1c-c4fb-4597-9667-f377d36939d7","Type":"ContainerStarted","Data":"f7a9c68e0206fca30541d4787e41dc2d2cf3bf1e50941eb5d5a612ccaaff9c0a"} Apr 20 22:25:12.331206 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.331141 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" event={"ID":"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf","Type":"ContainerStarted","Data":"770756344c7bdbd5a5114c98bf3e1948222c91ac63914336083403db2a63611c"} Apr 20 22:25:12.337515 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.337457 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-8kxkw" podStartSLOduration=2.8087361140000002 podStartE2EDuration="6.337435632s" podCreationTimestamp="2026-04-20 22:25:06 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.593425086 +0000 UTC m=+35.266612752" 
lastFinishedPulling="2026-04-20 22:25:11.122124607 +0000 UTC m=+38.795312270" observedRunningTime="2026-04-20 22:25:12.336440932 +0000 UTC m=+40.009628621" watchObservedRunningTime="2026-04-20 22:25:12.337435632 +0000 UTC m=+40.010623319" Apr 20 22:25:12.366791 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.365851 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4npn9" podStartSLOduration=6.247018393 podStartE2EDuration="39.365833036s" podCreationTimestamp="2026-04-20 22:24:33 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.177224747 +0000 UTC m=+1.850412414" lastFinishedPulling="2026-04-20 22:25:07.296039381 +0000 UTC m=+34.969227057" observedRunningTime="2026-04-20 22:25:12.365429622 +0000 UTC m=+40.038617308" watchObservedRunningTime="2026-04-20 22:25:12.365833036 +0000 UTC m=+40.039020722" Apr 20 22:25:12.384183 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.383187 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-f2prt" podStartSLOduration=3.198577104 podStartE2EDuration="6.38316463s" podCreationTimestamp="2026-04-20 22:25:06 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.54501779 +0000 UTC m=+35.218205457" lastFinishedPulling="2026-04-20 22:25:10.729605306 +0000 UTC m=+38.402792983" observedRunningTime="2026-04-20 22:25:12.381663362 +0000 UTC m=+40.054851055" watchObservedRunningTime="2026-04-20 22:25:12.38316463 +0000 UTC m=+40.056352313" Apr 20 22:25:12.400727 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.399925 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6jbt7" podStartSLOduration=2.687053408 podStartE2EDuration="6.399902258s" podCreationTimestamp="2026-04-20 22:25:06 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.012589165 +0000 UTC m=+34.685776843" lastFinishedPulling="2026-04-20 22:25:10.725438014 +0000 
UTC m=+38.398625693" observedRunningTime="2026-04-20 22:25:12.39927592 +0000 UTC m=+40.072463606" watchObservedRunningTime="2026-04-20 22:25:12.399902258 +0000 UTC m=+40.073089945" Apr 20 22:25:12.645990 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.643881 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:25:12.668847 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.668795 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:25:12.669071 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.669042 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.680181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.680024 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 22:25:12.683818 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.680611 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 22:25:12.683818 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.680899 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 22:25:12.683818 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.681271 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 22:25:12.683818 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.681500 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 22:25:12.683818 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.681703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 22:25:12.686120 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.685184 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 22:25:12.686120 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.685485 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bgi9927i5vdvg\"" Apr 20 22:25:12.686120 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.685728 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 22:25:12.686120 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.685911 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n8wr7\"" Apr 20 22:25:12.686498 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.686310 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 22:25:12.688271 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.686601 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 22:25:12.691342 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.689951 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 22:25:12.691342 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.691101 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 22:25:12.804684 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804600 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.804684 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804893 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.804981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805016 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805051 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805096 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805132 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805244 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805217 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805772 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2642m\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-kube-api-access-2642m\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
22:25:12.805772 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805772 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805772 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805366 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.805772 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.805403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.905902 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.905815 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.905902 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.905866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906091 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.905903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906091 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.905939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906091 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.905964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906091 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906020 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906091 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906091 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906180 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906251 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906313 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906658 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2642m\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-kube-api-access-2642m\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906658 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.906658 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.906401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.910021 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.907771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.910021 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.908088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.910021 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.908708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.910021 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.908943 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.916620 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.912334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.918307 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2642m\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-kube-api-access-2642m\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.919263 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.920037 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config-out\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.920604 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.920707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.921081 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.922274 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.921851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.926262 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.923710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.926262 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.925684 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.926262 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.925705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.926262 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.925798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.926262 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.925949 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:12.929134 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:12.928673 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-web-config\") pod \"prometheus-k8s-0\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:13.003559 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:13.003520 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:16.981671 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:16.981643 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:25:16.984087 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:16.984056 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce82e00_8ebc_4c6f_8cd4_172874b459cb.slice/crio-0aec9e7b4f8e47beaad5616c564d19d8dbb2988f57175fe9678288c175ae077e WatchSource:0}: Error finding container 0aec9e7b4f8e47beaad5616c564d19d8dbb2988f57175fe9678288c175ae077e: Status 404 returned error can't find the container with id 0aec9e7b4f8e47beaad5616c564d19d8dbb2988f57175fe9678288c175ae077e Apr 20 22:25:17.347292 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.347186 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-khz6t" event={"ID":"09fe9e71-2821-4229-a355-e118b4e9f593","Type":"ContainerStarted","Data":"55eb47291f6a47c3b5b299866ea4568b5e3fc6fd302123e4e78d76d06e3a19d7"} Apr 20 22:25:17.348816 ip-10-0-133-201 
kubenswrapper[2568]: I0420 22:25:17.348781 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b9ccffd-2bt72" event={"ID":"7f78bc43-5ed8-446b-83d7-02dc39db2008","Type":"ContainerStarted","Data":"7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071"} Apr 20 22:25:17.350801 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.350773 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"5030623d1f273da08ce24de4c7ccbf0fe5a6e624f668e0c1afd7370a7bf4161a"} Apr 20 22:25:17.350952 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.350811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"969782fbfcf7561a809452a3f30d5c848a88d5d972374faf034be2e2cdb9b63f"} Apr 20 22:25:17.350952 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.350826 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"bf73c7365e604ccf3effdda98a4c8e10c87bcc5f1103ce69c5995fcc12dcc9a8"} Apr 20 22:25:17.352339 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.352298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" event={"ID":"e9da1b1c-c4fb-4597-9667-f377d36939d7","Type":"ContainerStarted","Data":"1eb1b40b1d41f229643b49bc390e6a98e16a1b3abbc36ca5deccb51142e5abff"} Apr 20 22:25:17.352504 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.352488 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:17.353725 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.353701 
2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" event={"ID":"c2b4944e-42df-4cc7-a7fa-55aff7e04fbf","Type":"ContainerStarted","Data":"9ef496f5bbdd1ca523821c4dc567d5d323d70a418f1d284bc2ead34b30793f7f"} Apr 20 22:25:17.354923 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.354898 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"0aec9e7b4f8e47beaad5616c564d19d8dbb2988f57175fe9678288c175ae077e"} Apr 20 22:25:17.358265 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.358246 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" Apr 20 22:25:17.364261 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.364222 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-khz6t" podStartSLOduration=2.9449183679999997 podStartE2EDuration="12.364209628s" podCreationTimestamp="2026-04-20 22:25:05 +0000 UTC" firstStartedPulling="2026-04-20 22:25:07.414752325 +0000 UTC m=+35.087940002" lastFinishedPulling="2026-04-20 22:25:16.834043585 +0000 UTC m=+44.507231262" observedRunningTime="2026-04-20 22:25:17.363458466 +0000 UTC m=+45.036646151" watchObservedRunningTime="2026-04-20 22:25:17.364209628 +0000 UTC m=+45.037397307" Apr 20 22:25:17.379199 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.379133 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74b9ccffd-2bt72" podStartSLOduration=1.890625514 podStartE2EDuration="7.379118344s" podCreationTimestamp="2026-04-20 22:25:10 +0000 UTC" firstStartedPulling="2026-04-20 22:25:11.352843053 +0000 UTC m=+39.026030727" lastFinishedPulling="2026-04-20 22:25:16.84133589 +0000 UTC m=+44.514523557" observedRunningTime="2026-04-20 
22:25:17.377959677 +0000 UTC m=+45.051147363" watchObservedRunningTime="2026-04-20 22:25:17.379118344 +0000 UTC m=+45.052306029" Apr 20 22:25:17.392900 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.392844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" podStartSLOduration=2.056970718 podStartE2EDuration="7.392825484s" podCreationTimestamp="2026-04-20 22:25:10 +0000 UTC" firstStartedPulling="2026-04-20 22:25:11.500249565 +0000 UTC m=+39.173437228" lastFinishedPulling="2026-04-20 22:25:16.836104328 +0000 UTC m=+44.509291994" observedRunningTime="2026-04-20 22:25:17.392074223 +0000 UTC m=+45.065261910" watchObservedRunningTime="2026-04-20 22:25:17.392825484 +0000 UTC m=+45.066013170" Apr 20 22:25:17.407753 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:17.407685 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-22wf7" podStartSLOduration=1.331543348 podStartE2EDuration="6.407665613s" podCreationTimestamp="2026-04-20 22:25:11 +0000 UTC" firstStartedPulling="2026-04-20 22:25:11.759226506 +0000 UTC m=+39.432414184" lastFinishedPulling="2026-04-20 22:25:16.835348783 +0000 UTC m=+44.508536449" observedRunningTime="2026-04-20 22:25:17.406683964 +0000 UTC m=+45.079871651" watchObservedRunningTime="2026-04-20 22:25:17.407665613 +0000 UTC m=+45.080853417" Apr 20 22:25:19.363906 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.363867 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" exitCode=0 Apr 20 22:25:19.364406 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.363936 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} Apr 20 22:25:19.367090 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.367049 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"ed4dfdf47ea6000d8db086bc0d73e111269d6bfc6ca1a46f1432cd207d3a4060"} Apr 20 22:25:19.367090 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.367091 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"760d17c2f53c98dfe3fbf3b79d852f07fe10a80ff6ac4719004ba5906d1940f2"} Apr 20 22:25:19.367298 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.367105 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" event={"ID":"05ee4b26-952f-4609-bd5e-75d703d80bf3","Type":"ContainerStarted","Data":"21959a2a80ec4d1a40311a8704241afbb7371ba5108065c41b6b8b4f0737b853"} Apr 20 22:25:19.407314 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.407249 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" podStartSLOduration=2.826215993 podStartE2EDuration="10.407231384s" podCreationTimestamp="2026-04-20 22:25:09 +0000 UTC" firstStartedPulling="2026-04-20 22:25:11.129481207 +0000 UTC m=+38.802668877" lastFinishedPulling="2026-04-20 22:25:18.71049659 +0000 UTC m=+46.383684268" observedRunningTime="2026-04-20 22:25:19.405922628 +0000 UTC m=+47.079110349" watchObservedRunningTime="2026-04-20 22:25:19.407231384 +0000 UTC m=+47.080419118" Apr 20 22:25:19.521759 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:19.521720 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-74b9ccffd-2bt72"] Apr 20 22:25:20.370457 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:20.370372 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:21.076717 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:21.076679 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:22.336914 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:22.336878 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-57zvt" Apr 20 22:25:23.382227 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:23.382186 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} Apr 20 22:25:23.382649 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:23.382238 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} Apr 20 22:25:23.382649 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:23.382258 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} Apr 20 22:25:24.387576 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:24.387536 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} Apr 20 22:25:24.387948 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:24.387581 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} Apr 20 22:25:24.387948 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:24.387597 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerStarted","Data":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} Apr 20 22:25:24.420107 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:24.420057 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.503571711 podStartE2EDuration="12.420042097s" podCreationTimestamp="2026-04-20 22:25:12 +0000 UTC" firstStartedPulling="2026-04-20 22:25:16.986299324 +0000 UTC m=+44.659487004" lastFinishedPulling="2026-04-20 22:25:22.902769727 +0000 UTC m=+50.575957390" observedRunningTime="2026-04-20 22:25:24.417742573 +0000 UTC m=+52.090930261" watchObservedRunningTime="2026-04-20 22:25:24.420042097 +0000 UTC m=+52.093229782" Apr 20 22:25:26.378790 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:26.378761 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-65c5db58f9-mftmf" Apr 20 22:25:28.004516 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:28.004469 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:31.233202 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:31.233143 2568 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:31.233202 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:31.233207 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:32.192499 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:32.192469 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r2zfd" Apr 20 22:25:37.529924 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.529792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:37.532928 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.532904 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 22:25:37.542472 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.542438 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6b93d87-66d5-4f06-b428-6cbc7fcdeda2-metrics-certs\") pod \"network-metrics-daemon-5kgfv\" (UID: \"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2\") " pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:37.728550 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.728516 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddtmq\"" Apr 20 22:25:37.731603 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.731580 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2jv\" (UniqueName: 
\"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:37.733698 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.733678 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 22:25:37.736707 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.736688 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5kgfv" Apr 20 22:25:37.744557 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.744520 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 22:25:37.755360 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.755334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2jv\" (UniqueName: \"kubernetes.io/projected/a761dc14-770d-43e4-b87c-68589f057961-kube-api-access-cz2jv\") pod \"network-check-target-shrcq\" (UID: \"a761dc14-770d-43e4-b87c-68589f057961\") " pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:37.857563 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:37.857537 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5kgfv"] Apr 20 22:25:37.860222 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:37.860188 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b93d87_66d5_4f06_b428_6cbc7fcdeda2.slice/crio-30c9ec1ae61d39e8e7cfc8782ddd6fdd782f0a18c2bf4106f46abceb4aaa11f2 WatchSource:0}: Error finding container 30c9ec1ae61d39e8e7cfc8782ddd6fdd782f0a18c2bf4106f46abceb4aaa11f2: Status 404 returned error can't find the container 
with id 30c9ec1ae61d39e8e7cfc8782ddd6fdd782f0a18c2bf4106f46abceb4aaa11f2 Apr 20 22:25:38.023163 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:38.023110 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-66fzs\"" Apr 20 22:25:38.031243 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:38.031214 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:38.152310 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:38.152252 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-shrcq"] Apr 20 22:25:38.155938 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:25:38.155902 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda761dc14_770d_43e4_b87c_68589f057961.slice/crio-1cbccb186b6191bd90611bd1890af374143390b6e5aabbdf05011225ea214581 WatchSource:0}: Error finding container 1cbccb186b6191bd90611bd1890af374143390b6e5aabbdf05011225ea214581: Status 404 returned error can't find the container with id 1cbccb186b6191bd90611bd1890af374143390b6e5aabbdf05011225ea214581 Apr 20 22:25:38.433004 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:38.432960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5kgfv" event={"ID":"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2","Type":"ContainerStarted","Data":"30c9ec1ae61d39e8e7cfc8782ddd6fdd782f0a18c2bf4106f46abceb4aaa11f2"} Apr 20 22:25:38.434222 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:38.434187 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-shrcq" event={"ID":"a761dc14-770d-43e4-b87c-68589f057961","Type":"ContainerStarted","Data":"1cbccb186b6191bd90611bd1890af374143390b6e5aabbdf05011225ea214581"} Apr 20 22:25:40.442459 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:40.442420 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5kgfv" event={"ID":"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2","Type":"ContainerStarted","Data":"af3603025a01f8587cd9e291df952c34499d3125cdd00495803c24c0e706afe5"} Apr 20 22:25:40.442459 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:40.442462 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5kgfv" event={"ID":"a6b93d87-66d5-4f06-b428-6cbc7fcdeda2","Type":"ContainerStarted","Data":"1c7a9f04a69a4795cf526ba6c7aa9b04fc48828214767a31fa3e0b74c7922cbf"} Apr 20 22:25:40.458117 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:40.458061 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5kgfv" podStartSLOduration=67.04434073 podStartE2EDuration="1m8.458044628s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="2026-04-20 22:25:37.862204586 +0000 UTC m=+65.535392249" lastFinishedPulling="2026-04-20 22:25:39.275908471 +0000 UTC m=+66.949096147" observedRunningTime="2026-04-20 22:25:40.45704962 +0000 UTC m=+68.130237349" watchObservedRunningTime="2026-04-20 22:25:40.458044628 +0000 UTC m=+68.131232334" Apr 20 22:25:42.330112 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:42.330065 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:42.399552 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:42.399521 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:42.450798 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:42.450753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-shrcq" 
event={"ID":"a761dc14-770d-43e4-b87c-68589f057961","Type":"ContainerStarted","Data":"2be4fba9a05dbd9a38a2c9d321eee9efad256153b2a49acde5e717ca57a1bf5e"} Apr 20 22:25:42.451000 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:42.450928 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-shrcq" Apr 20 22:25:42.466195 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:42.466119 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-shrcq" podStartSLOduration=66.311473421 podStartE2EDuration="1m10.466099241s" podCreationTimestamp="2026-04-20 22:24:32 +0000 UTC" firstStartedPulling="2026-04-20 22:25:38.158222886 +0000 UTC m=+65.831410552" lastFinishedPulling="2026-04-20 22:25:42.312848694 +0000 UTC m=+69.986036372" observedRunningTime="2026-04-20 22:25:42.464635708 +0000 UTC m=+70.137823406" watchObservedRunningTime="2026-04-20 22:25:42.466099241 +0000 UTC m=+70.139286962" Apr 20 22:25:42.467316 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:42.467297 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:25:44.541355 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.541300 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74b9ccffd-2bt72" podUID="7f78bc43-5ed8-446b-83d7-02dc39db2008" containerName="console" containerID="cri-o://7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071" gracePeriod=15 Apr 20 22:25:44.832388 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.832363 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b9ccffd-2bt72_7f78bc43-5ed8-446b-83d7-02dc39db2008/console/0.log" Apr 20 22:25:44.832541 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.832443 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:44.899537 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.899459 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-config\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.899980 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.899963 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-oauth-serving-cert\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.900023 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.899999 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-trusted-ca-bundle\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.900075 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900033 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-oauth-config\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.900113 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900071 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbc7f\" (UniqueName: \"kubernetes.io/projected/7f78bc43-5ed8-446b-83d7-02dc39db2008-kube-api-access-rbc7f\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.900187 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900112 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-serving-cert\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.900462 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900348 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-config" (OuterVolumeSpecName: "console-config") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:25:44.900462 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900361 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-service-ca\") pod \"7f78bc43-5ed8-446b-83d7-02dc39db2008\" (UID: \"7f78bc43-5ed8-446b-83d7-02dc39db2008\") " Apr 20 22:25:44.900462 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900421 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:25:44.900779 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900466 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:25:44.900779 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900625 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-config\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:44.900779 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900646 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-oauth-serving-cert\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:44.900779 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900660 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-trusted-ca-bundle\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:44.900779 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.900658 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-service-ca" (OuterVolumeSpecName: "service-ca") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:25:44.902661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.902627 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f78bc43-5ed8-446b-83d7-02dc39db2008-kube-api-access-rbc7f" (OuterVolumeSpecName: "kube-api-access-rbc7f") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "kube-api-access-rbc7f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:25:44.902789 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.902759 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:44.902789 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:44.902779 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7f78bc43-5ed8-446b-83d7-02dc39db2008" (UID: "7f78bc43-5ed8-446b-83d7-02dc39db2008"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:45.001220 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.001182 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-oauth-config\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:45.001220 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.001215 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbc7f\" (UniqueName: \"kubernetes.io/projected/7f78bc43-5ed8-446b-83d7-02dc39db2008-kube-api-access-rbc7f\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:45.001220 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.001229 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78bc43-5ed8-446b-83d7-02dc39db2008-console-serving-cert\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:45.001459 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.001242 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78bc43-5ed8-446b-83d7-02dc39db2008-service-ca\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:25:45.459926 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.459901 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74b9ccffd-2bt72_7f78bc43-5ed8-446b-83d7-02dc39db2008/console/0.log" Apr 20 22:25:45.460100 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.459941 2568 generic.go:358] "Generic (PLEG): container finished" podID="7f78bc43-5ed8-446b-83d7-02dc39db2008" containerID="7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071" exitCode=2 Apr 20 22:25:45.460100 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.460031 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74b9ccffd-2bt72" Apr 20 22:25:45.460100 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.460031 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b9ccffd-2bt72" event={"ID":"7f78bc43-5ed8-446b-83d7-02dc39db2008","Type":"ContainerDied","Data":"7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071"} Apr 20 22:25:45.460100 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.460082 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74b9ccffd-2bt72" event={"ID":"7f78bc43-5ed8-446b-83d7-02dc39db2008","Type":"ContainerDied","Data":"ccdd8d7d12a4b6dcdba8653eaa2058d737aa8d417e071b8d579a523b67916086"} Apr 20 22:25:45.460100 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.460099 2568 scope.go:117] "RemoveContainer" containerID="7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071" Apr 20 22:25:45.468500 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.468478 2568 scope.go:117] "RemoveContainer" containerID="7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071" Apr 20 22:25:45.468823 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:25:45.468803 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071\": container with ID starting with 7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071 not found: ID does not exist" containerID="7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071" Apr 20 22:25:45.468899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.468830 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071"} err="failed to get container status \"7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071\": rpc error: code = 
NotFound desc = could not find container \"7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071\": container with ID starting with 7f97fe9e74529c09be6daedee3377b439bb57507abd54db63a049b2e20f43071 not found: ID does not exist" Apr 20 22:25:45.477597 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.477568 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74b9ccffd-2bt72"] Apr 20 22:25:45.480984 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:45.480949 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74b9ccffd-2bt72"] Apr 20 22:25:46.913694 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:46.913662 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f78bc43-5ed8-446b-83d7-02dc39db2008" path="/var/lib/kubelet/pods/7f78bc43-5ed8-446b-83d7-02dc39db2008/volumes" Apr 20 22:25:51.238839 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:51.238810 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:25:51.242966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:25:51.242942 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5c868c9cbc-68w4d" Apr 20 22:26:00.971464 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971427 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:26:00.971998 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971846 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="prometheus" containerID="cri-o://5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" gracePeriod=600 Apr 20 22:26:00.971998 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971895 2568 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy" containerID="cri-o://c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" gracePeriod=600 Apr 20 22:26:00.971998 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971916 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="thanos-sidecar" containerID="cri-o://dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" gracePeriod=600 Apr 20 22:26:00.971998 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971933 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-thanos" containerID="cri-o://09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" gracePeriod=600 Apr 20 22:26:00.971998 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971953 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="config-reloader" containerID="cri-o://24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" gracePeriod=600 Apr 20 22:26:00.972317 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:00.971941 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-web" containerID="cri-o://b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" gracePeriod=600 Apr 20 22:26:01.217423 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.217398 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:26:01.334635 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334487 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-trusted-ca-bundle\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.334635 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334627 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-metrics-client-certs\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.334899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334659 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-metrics-client-ca\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.334899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334688 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-tls-assets\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.334899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334716 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-kubelet-serving-ca-bundle\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" 
(UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.334899 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334855 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-grpc-tls\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334919 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334948 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-serving-certs-ca-bundle\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.334979 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config-out\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335003 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-tls\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: 
\"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335013 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335039 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-thanos-prometheus-http-client-file\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335066 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-rulefiles-0\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335106 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335133 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:01.335178 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335141 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-kube-rbac-proxy\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335683 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335310 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335683 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335373 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-db\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335683 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335399 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2642m\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-kube-api-access-2642m\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335683 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335398 2568 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:01.335683 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335453 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-web-config\") pod \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\" (UID: \"1ce82e00-8ebc-4c6f-8cd4-172874b459cb\") " Apr 20 22:26:01.335917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335683 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.335917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335702 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.335917 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.335717 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.337130 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.337094 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:01.337750 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.337727 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:26:01.337964 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.337938 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:26:01.338046 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.337945 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.338602 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.338563 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:26:01.338946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.338821 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.338946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.338864 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.338946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.338908 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config" (OuterVolumeSpecName: "config") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.338946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.338922 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.339496 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.339471 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.339867 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.339836 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config-out" (OuterVolumeSpecName: "config-out") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:26:01.339977 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.339931 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-kube-api-access-2642m" (OuterVolumeSpecName: "kube-api-access-2642m") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "kube-api-access-2642m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:26:01.340437 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.340414 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.340506 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.340451 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.348878 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.348841 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-web-config" (OuterVolumeSpecName: "web-config") pod "1ce82e00-8ebc-4c6f-8cd4-172874b459cb" (UID: "1ce82e00-8ebc-4c6f-8cd4-172874b459cb"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:26:01.436784 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436746 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-metrics-client-certs\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.436784 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436775 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-configmap-metrics-client-ca\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.436784 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436785 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-tls-assets\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.436784 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436794 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-grpc-tls\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436804 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436813 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config-out\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath 
\"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436825 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436835 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436844 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436852 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436861 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-secret-kube-rbac-proxy\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436869 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-config\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath 
\"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436878 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-prometheus-k8s-db\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436887 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2642m\" (UniqueName: \"kubernetes.io/projected/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-kube-api-access-2642m\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.437098 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.436895 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ce82e00-8ebc-4c6f-8cd4-172874b459cb-web-config\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:26:01.511594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511556 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" exitCode=0 Apr 20 22:26:01.511594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511579 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" exitCode=0 Apr 20 22:26:01.511594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511585 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" exitCode=0 Apr 20 22:26:01.511594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511591 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" 
containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" exitCode=0 Apr 20 22:26:01.511594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511596 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" exitCode=0 Apr 20 22:26:01.511594 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511603 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" exitCode=0 Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511659 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511661 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511671 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511697 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511706 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511715 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1ce82e00-8ebc-4c6f-8cd4-172874b459cb","Type":"ContainerDied","Data":"0aec9e7b4f8e47beaad5616c564d19d8dbb2988f57175fe9678288c175ae077e"} Apr 20 22:26:01.511918 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.511728 2568 scope.go:117] "RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.519895 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.519860 
2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.527443 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.527421 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.534759 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.534727 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:26:01.535076 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.535057 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.539574 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.539549 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:26:01.543085 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.543063 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" Apr 20 22:26:01.550193 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.550144 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" Apr 20 22:26:01.557747 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.557725 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" Apr 20 22:26:01.565125 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565091 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 22:26:01.565508 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565486 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="prometheus" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565513 2568 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="prometheus" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565533 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565543 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565555 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-thanos" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565565 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-thanos" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565586 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-web" Apr 20 22:26:01.565593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565597 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-web" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565610 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="init-config-reloader" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565619 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="init-config-reloader" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:26:01.565628 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="thanos-sidecar" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565637 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="thanos-sidecar" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565652 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f78bc43-5ed8-446b-83d7-02dc39db2008" containerName="console" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565659 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f78bc43-5ed8-446b-83d7-02dc39db2008" containerName="console" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565668 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="config-reloader" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565676 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="config-reloader" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565748 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-thanos" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565761 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565770 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f78bc43-5ed8-446b-83d7-02dc39db2008" containerName="console" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:26:01.565779 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="prometheus" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565787 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="config-reloader" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565797 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="thanos-sidecar" Apr 20 22:26:01.565909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.565806 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" containerName="kube-rbac-proxy-web" Apr 20 22:26:01.566446 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566025 2568 scope.go:117] "RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.566446 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.566382 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.566446 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566408 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} err="failed to get container status \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container 
with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist" Apr 20 22:26:01.566446 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566428 2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.566689 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.566674 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.566735 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566693 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} err="failed to get container status \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist" Apr 20 22:26:01.566735 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566705 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.566929 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.566913 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID 
does not exist" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.566981 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566932 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} err="failed to get container status \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID does not exist" Apr 20 22:26:01.566981 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.566958 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.567220 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.567189 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not exist" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.567220 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567207 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} err="failed to get container status \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not 
exist" Apr 20 22:26:01.567305 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567221 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" Apr 20 22:26:01.567398 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.567384 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" Apr 20 22:26:01.567436 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567399 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} err="failed to get container status \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist" Apr 20 22:26:01.567436 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567409 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" Apr 20 22:26:01.567592 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.567576 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" Apr 20 
22:26:01.567634 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567593 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} err="failed to get container status \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist" Apr 20 22:26:01.567634 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567605 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" Apr 20 22:26:01.567793 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:26:01.567778 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" Apr 20 22:26:01.567846 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567795 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} err="failed to get container status \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist" Apr 20 22:26:01.567846 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567807 2568 scope.go:117] 
"RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.567988 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567971 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} err="failed to get container status \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist" Apr 20 22:26:01.568044 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.567988 2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.568200 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568184 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} err="failed to get container status \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist" Apr 20 22:26:01.568240 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568200 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.568387 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568372 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} err="failed to get container 
status \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID does not exist" Apr 20 22:26:01.568427 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568387 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.568562 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568547 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} err="failed to get container status \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not exist" Apr 20 22:26:01.568603 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568562 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" Apr 20 22:26:01.568720 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568706 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} err="failed to get container status \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist" Apr 20 22:26:01.568768 ip-10-0-133-201 
kubenswrapper[2568]: I0420 22:26:01.568720 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" Apr 20 22:26:01.568934 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568914 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} err="failed to get container status \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist" Apr 20 22:26:01.568934 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.568934 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" Apr 20 22:26:01.569111 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569095 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} err="failed to get container status \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist" Apr 20 22:26:01.569174 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569111 2568 scope.go:117] "RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.569293 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569278 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} err="failed to get container status \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist" Apr 20 22:26:01.569330 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569293 2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.569463 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569448 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} err="failed to get container status \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist" Apr 20 22:26:01.569463 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569462 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.569642 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569627 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} err="failed to get container status \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with 
b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID does not exist" Apr 20 22:26:01.569680 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569641 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.569802 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569784 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} err="failed to get container status \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not exist" Apr 20 22:26:01.569843 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.569803 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" Apr 20 22:26:01.570021 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570005 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:26:01.570114 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570002 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} err="failed to get container status \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist" Apr 20 22:26:01.570114 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570111 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" Apr 20 22:26:01.570593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570568 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} err="failed to get container status \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist" Apr 20 22:26:01.570593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570592 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" Apr 20 22:26:01.570834 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570814 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} err="failed to get container status 
\"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist" Apr 20 22:26:01.571037 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.570834 2568 scope.go:117] "RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.571136 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571119 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} err="failed to get container status \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist" Apr 20 22:26:01.571204 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571136 2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.571400 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571380 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} err="failed to get container status \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist" Apr 20 22:26:01.571400 ip-10-0-133-201 
kubenswrapper[2568]: I0420 22:26:01.571399 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.571595 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571579 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} err="failed to get container status \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID does not exist" Apr 20 22:26:01.571595 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571595 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.571832 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571814 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} err="failed to get container status \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not exist" Apr 20 22:26:01.571832 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.571831 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad" Apr 20 22:26:01.572025 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572003 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} err="failed to get container status \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist" Apr 20 22:26:01.572099 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572025 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98" Apr 20 22:26:01.572405 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572385 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} err="failed to get container status \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist" Apr 20 22:26:01.572475 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572404 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18" Apr 20 22:26:01.572523 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572500 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572640 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572640 2568 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} err="failed to get container status \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572663 2568 scope.go:117] "RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572689 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572733 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572742 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572832 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 22:26:01.572875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.572846 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 22:26:01.573326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573050 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} err="failed to get container status \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist" Apr 20 22:26:01.573326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573073 2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f" Apr 20 22:26:01.573326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573134 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 22:26:01.573326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573205 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 22:26:01.573326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573248 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bgi9927i5vdvg\"" Apr 20 22:26:01.573531 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573403 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} err="failed to get container status \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist" Apr 20 22:26:01.573531 ip-10-0-133-201 kubenswrapper[2568]: I0420 
22:26:01.573426 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b" Apr 20 22:26:01.573531 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573444 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-n8wr7\"" Apr 20 22:26:01.573722 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573691 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} err="failed to get container status \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID does not exist" Apr 20 22:26:01.573766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573725 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8" Apr 20 22:26:01.573889 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573859 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 22:26:01.573997 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573970 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} err="failed to get container status \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not exist" Apr 20 
22:26:01.574054 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.573999 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"
Apr 20 22:26:01.574320 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.574293 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} err="failed to get container status \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist"
Apr 20 22:26:01.574430 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.574320 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"
Apr 20 22:26:01.574689 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.574667 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} err="failed to get container status \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist"
Apr 20 22:26:01.574785 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.574690 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"
Apr 20 22:26:01.574988 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.574962 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} err="failed to get container status \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist"
Apr 20 22:26:01.575078 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.574989 2568 scope.go:117] "RemoveContainer" containerID="09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"
Apr 20 22:26:01.575266 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.575243 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c"} err="failed to get container status \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": rpc error: code = NotFound desc = could not find container \"09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c\": container with ID starting with 09c20c699fe0e0924a64774e451dede415572bb6e0ad04b57388219cb8fa3c0c not found: ID does not exist"
Apr 20 22:26:01.575371 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.575268 2568 scope.go:117] "RemoveContainer" containerID="c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"
Apr 20 22:26:01.575540 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.575513 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f"} err="failed to get container status \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": rpc error: code = NotFound desc = could not find container \"c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f\": container with ID starting with c3c21c65c91b3e95517bc6e90d59b3f0d9f133acc0b64faa417825f2ded00e2f not found: ID does not exist"
Apr 20 22:26:01.575619 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.575541 2568 scope.go:117] "RemoveContainer" containerID="b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"
Apr 20 22:26:01.575859 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.575832 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b"} err="failed to get container status \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": rpc error: code = NotFound desc = could not find container \"b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b\": container with ID starting with b0dfc9581ea94ff937e46dae76298472f9164fc0c2e4efa272a01e508a870a6b not found: ID does not exist"
Apr 20 22:26:01.575969 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.575861 2568 scope.go:117] "RemoveContainer" containerID="dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"
Apr 20 22:26:01.576049 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576033 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 22:26:01.576199 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576179 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8"} err="failed to get container status \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": rpc error: code = NotFound desc = could not find container \"dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8\": container with ID starting with dcce922d3924a30b37d2a511395b6ee790c1dc789f122d2248901586d771c6a8 not found: ID does not exist"
Apr 20 22:26:01.576277 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576201 2568 scope.go:117] "RemoveContainer" containerID="24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"
Apr 20 22:26:01.576481 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576439 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad"} err="failed to get container status \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": rpc error: code = NotFound desc = could not find container \"24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad\": container with ID starting with 24e3c324524bcaff84301330d10f3d0dbafb0d57ee308c4fa33e88011ffe01ad not found: ID does not exist"
Apr 20 22:26:01.576481 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576471 2568 scope.go:117] "RemoveContainer" containerID="5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"
Apr 20 22:26:01.576714 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576694 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98"} err="failed to get container status \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": rpc error: code = NotFound desc = could not find container \"5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98\": container with ID starting with 5eea5081aca138b7065af2e4899e09a7da46d7c6b1da2bc25c219fbbaa9bfa98 not found: ID does not exist"
Apr 20 22:26:01.576788 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.576713 2568 scope.go:117] "RemoveContainer" containerID="24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"
Apr 20 22:26:01.577225 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.577112 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18"} err="failed to get container status \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": rpc error: code = NotFound desc = could not find container \"24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18\": container with ID starting with 24022de2aadbf6ef2ec5262e48f72b3ecda393d23a4384671f67491417555e18 not found: ID does not exist"
Apr 20 22:26:01.578280 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.578255 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 22:26:01.581536 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.581512 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 22:26:01.738418 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-config\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738418 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738442 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-web-config\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738461 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738507 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2jh\" (UniqueName: \"kubernetes.io/projected/3dd81110-a0ea-4ced-9e4c-9c8f87002448-kube-api-access-qt2jh\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738564 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738652 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dd81110-a0ea-4ced-9e4c-9c8f87002448-config-out\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dd81110-a0ea-4ced-9e4c-9c8f87002448-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738776 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.738799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.739045 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.739045 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738836 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.739045 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.738856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.839941 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.839904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.839941 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.839939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2jh\" (UniqueName: \"kubernetes.io/projected/3dd81110-a0ea-4ced-9e4c-9c8f87002448-kube-api-access-qt2jh\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.840141 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.839962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.840141 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.839979 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.840141 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.840017 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.840141 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.840036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.840439 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.840052 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.841118 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.841249 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841141 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.841249 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841219 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dd81110-a0ea-4ced-9e4c-9c8f87002448-config-out\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dd81110-a0ea-4ced-9e4c-9c8f87002448-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841566 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841667 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-config\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.842258 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.841864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-web-config\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.843684 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.843648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.843999 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.843975 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3dd81110-a0ea-4ced-9e4c-9c8f87002448-config-out\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.844062 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.844012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.844984 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.844613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.844984 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.844755 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.845126 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.845069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.845212 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.845189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-web-config\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.845623 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.845291 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3dd81110-a0ea-4ced-9e4c-9c8f87002448-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.847839 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.846482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.847839 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.846742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-config\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.847839 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.847443 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.847839 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.847766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.848087 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.847867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3dd81110-a0ea-4ced-9e4c-9c8f87002448-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.848789 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.848762 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3dd81110-a0ea-4ced-9e4c-9c8f87002448-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.850514 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.850489 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2jh\" (UniqueName: \"kubernetes.io/projected/3dd81110-a0ea-4ced-9e4c-9c8f87002448-kube-api-access-qt2jh\") pod \"prometheus-k8s-0\" (UID: \"3dd81110-a0ea-4ced-9e4c-9c8f87002448\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:01.881625 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:01.881570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:02.013825 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:02.013550 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 22:26:02.016260 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:26:02.016222 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd81110_a0ea_4ced_9e4c_9c8f87002448.slice/crio-f50434d751d9ae375814518534ed9699853e586babe3ab356c0512eb98e359a3 WatchSource:0}: Error finding container f50434d751d9ae375814518534ed9699853e586babe3ab356c0512eb98e359a3: Status 404 returned error can't find the container with id f50434d751d9ae375814518534ed9699853e586babe3ab356c0512eb98e359a3
Apr 20 22:26:02.516484 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:02.516450 2568 generic.go:358] "Generic (PLEG): container finished" podID="3dd81110-a0ea-4ced-9e4c-9c8f87002448" containerID="f9fb4565ee4ca0b9c2364d7b96f291030cd0e24e8f1b0f49625d4ca92607f47e" exitCode=0
Apr 20 22:26:02.516659 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:02.516543 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerDied","Data":"f9fb4565ee4ca0b9c2364d7b96f291030cd0e24e8f1b0f49625d4ca92607f47e"}
Apr 20 22:26:02.516659 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:02.516577 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"f50434d751d9ae375814518534ed9699853e586babe3ab356c0512eb98e359a3"}
Apr 20 22:26:02.913562 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:02.913528 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce82e00-8ebc-4c6f-8cd4-172874b459cb" path="/var/lib/kubelet/pods/1ce82e00-8ebc-4c6f-8cd4-172874b459cb/volumes"
Apr 20 22:26:03.523266 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.523228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"d5c9c3afef086f675049829b27a1b0101ac64ca94441b4c001dc1ac2de64a9ca"}
Apr 20 22:26:03.523266 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.523269 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"47a80993509842681f7ce371678705618d661fde10c11e2ef52f80dad06e6c98"}
Apr 20 22:26:03.523642 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.523280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"2655c1fdb327ce93dd0685209f88895f6789b955f17472540977faa983adcad7"}
Apr 20 22:26:03.523642 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.523290 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"c8a0131bb1ab0ec0e0af99ae696e718dd322837070eed6a3affa72e598f03cdb"}
Apr 20 22:26:03.523642 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.523298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"7043876ef983e868c0da77ed3db6939baf51f915bd102e3dcb5b18cc2eb8e01f"}
Apr 20 22:26:03.523642 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.523305 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3dd81110-a0ea-4ced-9e4c-9c8f87002448","Type":"ContainerStarted","Data":"866b23d8251e5eeeb984fbb35b6c1e13767c0d3705995d7b6b9857818a5fd10c"}
Apr 20 22:26:03.549894 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:03.549824 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.5498006 podStartE2EDuration="2.5498006s" podCreationTimestamp="2026-04-20 22:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:26:03.548502623 +0000 UTC m=+91.221690343" watchObservedRunningTime="2026-04-20 22:26:03.5498006 +0000 UTC m=+91.222988286"
Apr 20 22:26:06.882423 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:06.882366 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 22:26:08.530576 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:08.530544 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f2prt_bae61085-f01a-4979-8495-49df502b51b9/kube-state-metrics/0.log"
Apr 20 22:26:08.717564 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:08.717532 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f2prt_bae61085-f01a-4979-8495-49df502b51b9/kube-rbac-proxy-main/0.log"
Apr 20 22:26:08.918785 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:08.918709 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f2prt_bae61085-f01a-4979-8495-49df502b51b9/kube-rbac-proxy-self/0.log"
Apr 20 22:26:09.118065 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:09.118031 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5c868c9cbc-68w4d_c2b4944e-42df-4cc7-a7fa-55aff7e04fbf/metrics-server/0.log"
Apr 20 22:26:09.318436 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:09.318404 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-22wf7_e9da1b1c-c4fb-4597-9667-f377d36939d7/monitoring-plugin/0.log"
Apr 20 22:26:09.518687 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:09.518657 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6jbt7_84a62217-01ac-4867-83c4-e5586c70021c/init-textfile/0.log"
Apr 20 22:26:09.719013 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:09.718983 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6jbt7_84a62217-01ac-4867-83c4-e5586c70021c/node-exporter/0.log"
Apr 20 22:26:09.918789 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:09.918756 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6jbt7_84a62217-01ac-4867-83c4-e5586c70021c/kube-rbac-proxy/0.log"
Apr 20 22:26:11.318702 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:11.318671 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kxkw_5ac8a623-4817-4b58-9c3f-57dce933db29/kube-rbac-proxy-main/0.log"
Apr 20 22:26:11.518940 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:11.518913 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kxkw_5ac8a623-4817-4b58-9c3f-57dce933db29/kube-rbac-proxy-self/0.log"
Apr 20 22:26:11.718057 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:11.718034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kxkw_5ac8a623-4817-4b58-9c3f-57dce933db29/openshift-state-metrics/0.log"
Apr 20 22:26:11.926703 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:11.926675 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/init-config-reloader/0.log"
Apr 20 22:26:12.119416 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:12.119338 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/prometheus/0.log"
Apr 20 22:26:12.318451 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:12.318423 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/config-reloader/0.log"
Apr 20 22:26:12.518261 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:12.518237 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/thanos-sidecar/0.log"
Apr 20 22:26:12.718586 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:12.718556 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/kube-rbac-proxy-web/0.log"
Apr 20 22:26:12.918086 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:12.918012 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/kube-rbac-proxy/0.log"
Apr 20 22:26:13.117638 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:13.117609 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/kube-rbac-proxy-thanos/0.log"
Apr 20 22:26:13.456126 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:13.456083 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-shrcq"
Apr 20 22:26:14.518573 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:14.518544 2568 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/thanos-query/0.log" Apr 20 22:26:14.717975 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:14.717906 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy-web/0.log" Apr 20 22:26:14.918992 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:14.918914 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy/0.log" Apr 20 22:26:15.118222 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:15.118191 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/prom-label-proxy/0.log" Apr 20 22:26:15.318895 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:15.318863 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy-rules/0.log" Apr 20 22:26:15.518121 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:15.518093 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy-metrics/0.log" Apr 20 22:26:17.118112 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:26:17.118083 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5756b_c1e78fef-7128-47b2-a77d-46a98bb24af9/serve-healthcheck-canary/0.log" Apr 20 22:27:01.882805 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:01.882755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:27:01.898692 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:01.898656 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:27:02.709990 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:02.709957 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 22:27:54.185893 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.185796 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m6mz4"] Apr 20 22:27:54.188904 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.188885 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.191697 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.191674 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 22:27:54.197272 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.197245 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m6mz4"] Apr 20 22:27:54.286620 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.286579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2be881fb-1970-4504-9584-45d4d886c5a9-original-pull-secret\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.286805 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.286659 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2be881fb-1970-4504-9584-45d4d886c5a9-kubelet-config\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.286805 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.286712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2be881fb-1970-4504-9584-45d4d886c5a9-dbus\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.388019 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.387982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2be881fb-1970-4504-9584-45d4d886c5a9-original-pull-secret\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.388264 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.388084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2be881fb-1970-4504-9584-45d4d886c5a9-kubelet-config\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.388264 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.388130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2be881fb-1970-4504-9584-45d4d886c5a9-kubelet-config\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.388264 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.388178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2be881fb-1970-4504-9584-45d4d886c5a9-dbus\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " 
pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.388387 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.388352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2be881fb-1970-4504-9584-45d4d886c5a9-dbus\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.390369 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.390344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2be881fb-1970-4504-9584-45d4d886c5a9-original-pull-secret\") pod \"global-pull-secret-syncer-m6mz4\" (UID: \"2be881fb-1970-4504-9584-45d4d886c5a9\") " pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.498297 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.498202 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m6mz4" Apr 20 22:27:54.620661 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.620626 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m6mz4"] Apr 20 22:27:54.624408 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:27:54.624375 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be881fb_1970_4504_9584_45d4d886c5a9.slice/crio-ef81a6b0d14f0b139b3826a7defe67f4e454295f57dd15d5a7fcf65e3f426e92 WatchSource:0}: Error finding container ef81a6b0d14f0b139b3826a7defe67f4e454295f57dd15d5a7fcf65e3f426e92: Status 404 returned error can't find the container with id ef81a6b0d14f0b139b3826a7defe67f4e454295f57dd15d5a7fcf65e3f426e92 Apr 20 22:27:54.844272 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:54.844175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m6mz4" 
event={"ID":"2be881fb-1970-4504-9584-45d4d886c5a9","Type":"ContainerStarted","Data":"ef81a6b0d14f0b139b3826a7defe67f4e454295f57dd15d5a7fcf65e3f426e92"} Apr 20 22:27:59.864915 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:59.864880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m6mz4" event={"ID":"2be881fb-1970-4504-9584-45d4d886c5a9","Type":"ContainerStarted","Data":"486862b125beda29e2b5df7d70f10fe29c93730b66d65e764b5bb1bf3963c3d4"} Apr 20 22:27:59.881696 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:27:59.881645 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m6mz4" podStartSLOduration=1.287795214 podStartE2EDuration="5.881628012s" podCreationTimestamp="2026-04-20 22:27:54 +0000 UTC" firstStartedPulling="2026-04-20 22:27:54.626305893 +0000 UTC m=+202.299493561" lastFinishedPulling="2026-04-20 22:27:59.220138691 +0000 UTC m=+206.893326359" observedRunningTime="2026-04-20 22:27:59.879957876 +0000 UTC m=+207.553145563" watchObservedRunningTime="2026-04-20 22:27:59.881628012 +0000 UTC m=+207.554815711" Apr 20 22:29:07.541356 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.541279 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-9m9ll"] Apr 20 22:29:07.544424 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.544406 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.546846 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.546823 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 22:29:07.547931 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.547911 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-gvkgk\"" Apr 20 22:29:07.548023 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.547972 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 22:29:07.554207 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.554176 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-9m9ll"] Apr 20 22:29:07.575006 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.574960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b5c41ea-44a2-4bbe-84ef-3470897094e3-bound-sa-token\") pod \"cert-manager-79c8d999ff-9m9ll\" (UID: \"5b5c41ea-44a2-4bbe-84ef-3470897094e3\") " pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.575181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.575085 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fwhj\" (UniqueName: \"kubernetes.io/projected/5b5c41ea-44a2-4bbe-84ef-3470897094e3-kube-api-access-7fwhj\") pod \"cert-manager-79c8d999ff-9m9ll\" (UID: \"5b5c41ea-44a2-4bbe-84ef-3470897094e3\") " pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.676347 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.676303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fwhj\" (UniqueName: 
\"kubernetes.io/projected/5b5c41ea-44a2-4bbe-84ef-3470897094e3-kube-api-access-7fwhj\") pod \"cert-manager-79c8d999ff-9m9ll\" (UID: \"5b5c41ea-44a2-4bbe-84ef-3470897094e3\") " pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.676528 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.676365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b5c41ea-44a2-4bbe-84ef-3470897094e3-bound-sa-token\") pod \"cert-manager-79c8d999ff-9m9ll\" (UID: \"5b5c41ea-44a2-4bbe-84ef-3470897094e3\") " pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.683841 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.683805 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b5c41ea-44a2-4bbe-84ef-3470897094e3-bound-sa-token\") pod \"cert-manager-79c8d999ff-9m9ll\" (UID: \"5b5c41ea-44a2-4bbe-84ef-3470897094e3\") " pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.683997 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.683873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fwhj\" (UniqueName: \"kubernetes.io/projected/5b5c41ea-44a2-4bbe-84ef-3470897094e3-kube-api-access-7fwhj\") pod \"cert-manager-79c8d999ff-9m9ll\" (UID: \"5b5c41ea-44a2-4bbe-84ef-3470897094e3\") " pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.862752 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.862667 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-9m9ll" Apr 20 22:29:07.995606 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:07.995573 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-9m9ll"] Apr 20 22:29:07.999002 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:29:07.998914 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b5c41ea_44a2_4bbe_84ef_3470897094e3.slice/crio-aca039215dbdbd3f3bdd8282dc6d819532a40a72a2f8394cf7675e45057b4eef WatchSource:0}: Error finding container aca039215dbdbd3f3bdd8282dc6d819532a40a72a2f8394cf7675e45057b4eef: Status 404 returned error can't find the container with id aca039215dbdbd3f3bdd8282dc6d819532a40a72a2f8394cf7675e45057b4eef Apr 20 22:29:08.048503 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.048460 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-9m9ll" event={"ID":"5b5c41ea-44a2-4bbe-84ef-3470897094e3","Type":"ContainerStarted","Data":"aca039215dbdbd3f3bdd8282dc6d819532a40a72a2f8394cf7675e45057b4eef"} Apr 20 22:29:08.943913 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.943867 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2"] Apr 20 22:29:08.947644 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.947614 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:08.950089 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.950062 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:29:08.951116 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.951086 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-frc98\"" Apr 20 22:29:08.951260 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.951170 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 22:29:08.956630 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.956574 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2"] Apr 20 22:29:08.988944 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.988868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2851f99-0f37-44c3-98b1-9a45a63ddce0-tmp\") pod \"openshift-lws-operator-bfc7f696d-dzlm2\" (UID: \"b2851f99-0f37-44c3-98b1-9a45a63ddce0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:08.989144 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:08.988958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkn9f\" (UniqueName: \"kubernetes.io/projected/b2851f99-0f37-44c3-98b1-9a45a63ddce0-kube-api-access-zkn9f\") pod \"openshift-lws-operator-bfc7f696d-dzlm2\" (UID: \"b2851f99-0f37-44c3-98b1-9a45a63ddce0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:09.089684 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:09.089638 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2851f99-0f37-44c3-98b1-9a45a63ddce0-tmp\") pod \"openshift-lws-operator-bfc7f696d-dzlm2\" (UID: \"b2851f99-0f37-44c3-98b1-9a45a63ddce0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:09.089880 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:09.089729 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkn9f\" (UniqueName: \"kubernetes.io/projected/b2851f99-0f37-44c3-98b1-9a45a63ddce0-kube-api-access-zkn9f\") pod \"openshift-lws-operator-bfc7f696d-dzlm2\" (UID: \"b2851f99-0f37-44c3-98b1-9a45a63ddce0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:09.090328 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:09.090276 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2851f99-0f37-44c3-98b1-9a45a63ddce0-tmp\") pod \"openshift-lws-operator-bfc7f696d-dzlm2\" (UID: \"b2851f99-0f37-44c3-98b1-9a45a63ddce0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:09.098533 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:09.098484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkn9f\" (UniqueName: \"kubernetes.io/projected/b2851f99-0f37-44c3-98b1-9a45a63ddce0-kube-api-access-zkn9f\") pod \"openshift-lws-operator-bfc7f696d-dzlm2\" (UID: \"b2851f99-0f37-44c3-98b1-9a45a63ddce0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:09.261922 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:09.261825 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" Apr 20 22:29:09.417324 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:09.417255 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2"] Apr 20 22:29:10.919668 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:29:10.919635 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2851f99_0f37_44c3_98b1_9a45a63ddce0.slice/crio-7083e2b5c467c4c8a38736629e9d0b67bcbd434880cd7df4328cd2ff7daefabc WatchSource:0}: Error finding container 7083e2b5c467c4c8a38736629e9d0b67bcbd434880cd7df4328cd2ff7daefabc: Status 404 returned error can't find the container with id 7083e2b5c467c4c8a38736629e9d0b67bcbd434880cd7df4328cd2ff7daefabc Apr 20 22:29:11.060077 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:11.060043 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-9m9ll" event={"ID":"5b5c41ea-44a2-4bbe-84ef-3470897094e3","Type":"ContainerStarted","Data":"30cb83846d9099807f9552e2afa5391022592a81ce5f763d91805586b18301d0"} Apr 20 22:29:11.061206 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:11.061176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" event={"ID":"b2851f99-0f37-44c3-98b1-9a45a63ddce0","Type":"ContainerStarted","Data":"7083e2b5c467c4c8a38736629e9d0b67bcbd434880cd7df4328cd2ff7daefabc"} Apr 20 22:29:11.078433 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:11.078357 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-9m9ll" podStartSLOduration=1.091909282 podStartE2EDuration="4.078336676s" podCreationTimestamp="2026-04-20 22:29:07 +0000 UTC" firstStartedPulling="2026-04-20 22:29:08.001379076 +0000 UTC m=+275.674566757" lastFinishedPulling="2026-04-20 
22:29:10.987806488 +0000 UTC m=+278.660994151" observedRunningTime="2026-04-20 22:29:11.077740771 +0000 UTC m=+278.750928467" watchObservedRunningTime="2026-04-20 22:29:11.078336676 +0000 UTC m=+278.751524362" Apr 20 22:29:14.072770 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:14.072732 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" event={"ID":"b2851f99-0f37-44c3-98b1-9a45a63ddce0","Type":"ContainerStarted","Data":"4e5d6c9e021deecdad908af4a9edcab5733e539df7c9d32e3ce8945fae519708"} Apr 20 22:29:14.093024 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:14.092977 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-dzlm2" podStartSLOduration=3.735713289 podStartE2EDuration="6.092958912s" podCreationTimestamp="2026-04-20 22:29:08 +0000 UTC" firstStartedPulling="2026-04-20 22:29:10.921545862 +0000 UTC m=+278.594733528" lastFinishedPulling="2026-04-20 22:29:13.278791485 +0000 UTC m=+280.951979151" observedRunningTime="2026-04-20 22:29:14.092877825 +0000 UTC m=+281.766065513" watchObservedRunningTime="2026-04-20 22:29:14.092958912 +0000 UTC m=+281.766146596" Apr 20 22:29:30.207249 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.207215 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"] Apr 20 22:29:30.210446 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.210426 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" Apr 20 22:29:30.213416 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.213392 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 22:29:30.213554 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.213527 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-q26g9\"" Apr 20 22:29:30.213626 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.213459 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 22:29:30.213626 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.213463 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 22:29:30.213731 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.213456 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 22:29:30.226756 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.226726 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"] Apr 20 22:29:30.386805 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.386766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de45cef1-0c87-4ce3-958b-7bc29edea051-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" Apr 20 22:29:30.387007 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.386822 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de45cef1-0c87-4ce3-958b-7bc29edea051-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" Apr 20 22:29:30.387007 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.386876 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxw77\" (UniqueName: \"kubernetes.io/projected/de45cef1-0c87-4ce3-958b-7bc29edea051-kube-api-access-wxw77\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" Apr 20 22:29:30.487801 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.487712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxw77\" (UniqueName: \"kubernetes.io/projected/de45cef1-0c87-4ce3-958b-7bc29edea051-kube-api-access-wxw77\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" Apr 20 22:29:30.487801 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.487799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de45cef1-0c87-4ce3-958b-7bc29edea051-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" Apr 20 22:29:30.487988 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.487837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de45cef1-0c87-4ce3-958b-7bc29edea051-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:30.490404 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.490377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/de45cef1-0c87-4ce3-958b-7bc29edea051-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:30.490527 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.490409 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/de45cef1-0c87-4ce3-958b-7bc29edea051-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:30.500801 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.500767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxw77\" (UniqueName: \"kubernetes.io/projected/de45cef1-0c87-4ce3-958b-7bc29edea051-kube-api-access-wxw77\") pod \"opendatahub-operator-controller-manager-5d8d569d47-t9tvx\" (UID: \"de45cef1-0c87-4ce3-958b-7bc29edea051\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:30.521216 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.521181 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:30.679817 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.679787 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"]
Apr 20 22:29:30.682827 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:29:30.682796 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde45cef1_0c87_4ce3_958b_7bc29edea051.slice/crio-f5918c415cd56fbb1cae72ba8ecd9f990d342245c5af774bf8a5ca03a1b319a7 WatchSource:0}: Error finding container f5918c415cd56fbb1cae72ba8ecd9f990d342245c5af774bf8a5ca03a1b319a7: Status 404 returned error can't find the container with id f5918c415cd56fbb1cae72ba8ecd9f990d342245c5af774bf8a5ca03a1b319a7
Apr 20 22:29:30.851809 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.851724 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"]
Apr 20 22:29:30.856535 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.856503 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:30.859241 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.859211 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 22:29:30.859421 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.859250 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vdml4\""
Apr 20 22:29:30.859421 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.859278 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 22:29:30.859552 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.859537 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 22:29:30.868276 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.868213 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"]
Apr 20 22:29:30.993339 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.993300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967bcc7c-130b-413e-af49-f4650bda8ca6-cert\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:30.993526 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.993347 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hwsw\" (UniqueName: \"kubernetes.io/projected/967bcc7c-130b-413e-af49-f4650bda8ca6-kube-api-access-4hwsw\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:30.993526 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.993374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/967bcc7c-130b-413e-af49-f4650bda8ca6-metrics-cert\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:30.993526 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:30.993431 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/967bcc7c-130b-413e-af49-f4650bda8ca6-manager-config\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.094095 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.094050 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967bcc7c-130b-413e-af49-f4650bda8ca6-cert\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.094326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.094108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hwsw\" (UniqueName: \"kubernetes.io/projected/967bcc7c-130b-413e-af49-f4650bda8ca6-kube-api-access-4hwsw\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.094326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.094133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/967bcc7c-130b-413e-af49-f4650bda8ca6-metrics-cert\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.094326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.094205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/967bcc7c-130b-413e-af49-f4650bda8ca6-manager-config\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.094952 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.094927 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/967bcc7c-130b-413e-af49-f4650bda8ca6-manager-config\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.096730 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.096707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/967bcc7c-130b-413e-af49-f4650bda8ca6-metrics-cert\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.096841 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.096775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/967bcc7c-130b-413e-af49-f4650bda8ca6-cert\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.102660 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.102587 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hwsw\" (UniqueName: \"kubernetes.io/projected/967bcc7c-130b-413e-af49-f4650bda8ca6-kube-api-access-4hwsw\") pod \"lws-controller-manager-845776cd66-lznfq\" (UID: \"967bcc7c-130b-413e-af49-f4650bda8ca6\") " pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.125226 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.125179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" event={"ID":"de45cef1-0c87-4ce3-958b-7bc29edea051","Type":"ContainerStarted","Data":"f5918c415cd56fbb1cae72ba8ecd9f990d342245c5af774bf8a5ca03a1b319a7"}
Apr 20 22:29:31.166510 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.166469 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:31.313048 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:31.312660 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"]
Apr 20 22:29:31.315514 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:29:31.315479 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod967bcc7c_130b_413e_af49_f4650bda8ca6.slice/crio-d87af672ee7138c419519725fdaadcbebd4a5f93e6705919eb82d5a36cc60db5 WatchSource:0}: Error finding container d87af672ee7138c419519725fdaadcbebd4a5f93e6705919eb82d5a36cc60db5: Status 404 returned error can't find the container with id d87af672ee7138c419519725fdaadcbebd4a5f93e6705919eb82d5a36cc60db5
Apr 20 22:29:32.130966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:32.130916 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq" event={"ID":"967bcc7c-130b-413e-af49-f4650bda8ca6","Type":"ContainerStarted","Data":"d87af672ee7138c419519725fdaadcbebd4a5f93e6705919eb82d5a36cc60db5"}
Apr 20 22:29:33.253338 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:33.253313 2568 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 22:29:34.140927 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:34.140890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" event={"ID":"de45cef1-0c87-4ce3-958b-7bc29edea051","Type":"ContainerStarted","Data":"1cb995274a0648c19a690eec9702f6fd55b9dbd5975a112979a28e94351985b2"}
Apr 20 22:29:34.141087 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:34.141074 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:34.161024 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:34.160977 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx" podStartSLOduration=1.566396159 podStartE2EDuration="4.160956824s" podCreationTimestamp="2026-04-20 22:29:30 +0000 UTC" firstStartedPulling="2026-04-20 22:29:30.684601675 +0000 UTC m=+298.357789341" lastFinishedPulling="2026-04-20 22:29:33.279162321 +0000 UTC m=+300.952350006" observedRunningTime="2026-04-20 22:29:34.160273578 +0000 UTC m=+301.833461264" watchObservedRunningTime="2026-04-20 22:29:34.160956824 +0000 UTC m=+301.834144510"
Apr 20 22:29:35.145439 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:35.145397 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq" event={"ID":"967bcc7c-130b-413e-af49-f4650bda8ca6","Type":"ContainerStarted","Data":"724eac9fffe974599cccdb16066b24b1ead0ec503067a23064c83c2f19a0e693"}
Apr 20 22:29:35.145846 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:35.145715 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:35.172507 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:35.172450 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq" podStartSLOduration=2.392761468 podStartE2EDuration="5.17243464s" podCreationTimestamp="2026-04-20 22:29:30 +0000 UTC" firstStartedPulling="2026-04-20 22:29:31.318114865 +0000 UTC m=+298.991302532" lastFinishedPulling="2026-04-20 22:29:34.097788041 +0000 UTC m=+301.770975704" observedRunningTime="2026-04-20 22:29:35.171428613 +0000 UTC m=+302.844616300" watchObservedRunningTime="2026-04-20 22:29:35.17243464 +0000 UTC m=+302.845622326"
Apr 20 22:29:45.147603 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:45.147570 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-t9tvx"
Apr 20 22:29:46.151330 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:46.151299 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-845776cd66-lznfq"
Apr 20 22:29:49.098362 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.098323 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"]
Apr 20 22:29:49.108642 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.108611 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.110579 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.110528 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"]
Apr 20 22:29:49.111680 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.111508 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 22:29:49.112853 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.112816 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 22:29:49.112999 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.112894 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 22:29:49.112999 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.112910 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-4xpcs\""
Apr 20 22:29:49.113175 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.113138 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 22:29:49.157889 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.157852 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdtzm\" (UniqueName: \"kubernetes.io/projected/2fd070d1-dbe6-4220-a085-8debaefaff0d-kube-api-access-gdtzm\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.158069 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.157917 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fd070d1-dbe6-4220-a085-8debaefaff0d-tmp\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.158069 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.157950 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd070d1-dbe6-4220-a085-8debaefaff0d-tls-certs\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.258740 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.258695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd070d1-dbe6-4220-a085-8debaefaff0d-tls-certs\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.258957 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.258787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdtzm\" (UniqueName: \"kubernetes.io/projected/2fd070d1-dbe6-4220-a085-8debaefaff0d-kube-api-access-gdtzm\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.258957 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.258857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fd070d1-dbe6-4220-a085-8debaefaff0d-tmp\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.261090 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.261060 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2fd070d1-dbe6-4220-a085-8debaefaff0d-tmp\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.261242 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.261202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd070d1-dbe6-4220-a085-8debaefaff0d-tls-certs\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.268805 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.268777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdtzm\" (UniqueName: \"kubernetes.io/projected/2fd070d1-dbe6-4220-a085-8debaefaff0d-kube-api-access-gdtzm\") pod \"kube-auth-proxy-7755c94fdf-gtfmk\" (UID: \"2fd070d1-dbe6-4220-a085-8debaefaff0d\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.419287 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.419188 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"
Apr 20 22:29:49.545464 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.545431 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk"]
Apr 20 22:29:49.548399 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:29:49.548353 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd070d1_dbe6_4220_a085_8debaefaff0d.slice/crio-c211c79fad58acfb62a82e578e6f17bba59671132c364106f6a3e2b6e4501e75 WatchSource:0}: Error finding container c211c79fad58acfb62a82e578e6f17bba59671132c364106f6a3e2b6e4501e75: Status 404 returned error can't find the container with id c211c79fad58acfb62a82e578e6f17bba59671132c364106f6a3e2b6e4501e75
Apr 20 22:29:49.550039 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:49.550021 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 22:29:50.194644 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:50.194593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk" event={"ID":"2fd070d1-dbe6-4220-a085-8debaefaff0d","Type":"ContainerStarted","Data":"c211c79fad58acfb62a82e578e6f17bba59671132c364106f6a3e2b6e4501e75"}
Apr 20 22:29:53.206905 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:53.206864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk" event={"ID":"2fd070d1-dbe6-4220-a085-8debaefaff0d","Type":"ContainerStarted","Data":"33d8f26068b623249d718b7bf1771ceb4e0f4c4cf0a8bfb434895d5a244f4c75"}
Apr 20 22:29:53.223344 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:29:53.223290 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-gtfmk" podStartSLOduration=0.956658939 podStartE2EDuration="4.223272102s" podCreationTimestamp="2026-04-20 22:29:49 +0000 UTC" firstStartedPulling="2026-04-20 22:29:49.550164706 +0000 UTC m=+317.223352371" lastFinishedPulling="2026-04-20 22:29:52.816777866 +0000 UTC m=+320.489965534" observedRunningTime="2026-04-20 22:29:53.221897822 +0000 UTC m=+320.895085509" watchObservedRunningTime="2026-04-20 22:29:53.223272102 +0000 UTC m=+320.896459881"
Apr 20 22:31:10.649117 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.649075 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b4468b674-s7ckc"]
Apr 20 22:31:10.652027 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.652006 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.671418 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.671376 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 22:31:10.671418 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.671376 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-dn9dg\""
Apr 20 22:31:10.671654 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.671493 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 22:31:10.671654 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.671528 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 22:31:10.671654 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.671535 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 22:31:10.671654 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.671542 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 22:31:10.673909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.673887 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 22:31:10.684519 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.684478 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 22:31:10.686912 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.686888 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4468b674-s7ckc"]
Apr 20 22:31:10.694352 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.694328 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 22:31:10.758946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.758911 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-oauth-config\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.758946 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.758951 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-oauth-serving-cert\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.759174 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.758976 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-service-ca\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.759174 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.759054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-config\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.759174 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.759092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-serving-cert\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.759174 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.759112 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4w4t\" (UniqueName: \"kubernetes.io/projected/ea13891a-9f6f-4158-b1c2-afc15b0fb327-kube-api-access-z4w4t\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.759174 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.759134 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-trusted-ca-bundle\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.859891 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.859851 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-oauth-config\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.859891 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.859894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-oauth-serving-cert\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860142 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.859919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-service-ca\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860142 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.859972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-config\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860142 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860011 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-serving-cert\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860142 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860038 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4w4t\" (UniqueName: \"kubernetes.io/projected/ea13891a-9f6f-4158-b1c2-afc15b0fb327-kube-api-access-z4w4t\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860142 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-trusted-ca-bundle\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860766 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-config\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860873 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860779 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-oauth-serving-cert\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860873 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-trusted-ca-bundle\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.860980 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.860957 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea13891a-9f6f-4158-b1c2-afc15b0fb327-service-ca\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.862586 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.862567 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-serving-cert\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.862674 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.862594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ea13891a-9f6f-4158-b1c2-afc15b0fb327-console-oauth-config\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.870947 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.870926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4w4t\" (UniqueName: \"kubernetes.io/projected/ea13891a-9f6f-4158-b1c2-afc15b0fb327-kube-api-access-z4w4t\") pod \"console-7b4468b674-s7ckc\" (UID: \"ea13891a-9f6f-4158-b1c2-afc15b0fb327\") " pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:10.960865 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:10.960833 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:11.100781 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:11.100748 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b4468b674-s7ckc"]
Apr 20 22:31:11.104073 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:31:11.104042 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea13891a_9f6f_4158_b1c2_afc15b0fb327.slice/crio-c3a1fce1c1a253ec6b10a74cee38968100ed5feb88e4b62597e165d261978564 WatchSource:0}: Error finding container c3a1fce1c1a253ec6b10a74cee38968100ed5feb88e4b62597e165d261978564: Status 404 returned error can't find the container with id c3a1fce1c1a253ec6b10a74cee38968100ed5feb88e4b62597e165d261978564
Apr 20 22:31:11.454018 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:11.453980 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4468b674-s7ckc" event={"ID":"ea13891a-9f6f-4158-b1c2-afc15b0fb327","Type":"ContainerStarted","Data":"271623368a8fd26337c77120e00557d851ec2b1594a82624a98aba73c71bc15c"}
Apr 20 22:31:11.454249 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:11.454022 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b4468b674-s7ckc" event={"ID":"ea13891a-9f6f-4158-b1c2-afc15b0fb327","Type":"ContainerStarted","Data":"c3a1fce1c1a253ec6b10a74cee38968100ed5feb88e4b62597e165d261978564"}
Apr 20 22:31:11.484455 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:11.484401 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b4468b674-s7ckc" podStartSLOduration=1.4843825370000001 podStartE2EDuration="1.484382537s" podCreationTimestamp="2026-04-20 22:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:31:11.483866629 +0000 UTC m=+399.157054314" watchObservedRunningTime="2026-04-20 22:31:11.484382537 +0000 UTC m=+399.157570221"
Apr 20 22:31:20.962009 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:20.961963 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:20.962009 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:20.962011 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:20.968340 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:20.968312 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:21.491113 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:21.491086 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b4468b674-s7ckc"
Apr 20 22:31:39.669627 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.669595 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn"]
Apr 20 22:31:39.676873 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.676841 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn"
Apr 20 22:31:39.680704 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.680672 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 20 22:31:39.681079 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.681058 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 22:31:39.681198 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.681133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 22:31:39.681923 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.681901 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 20 22:31:39.682095 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.682076 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9r985\""
Apr 20 22:31:39.689649 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.689619 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn"]
Apr 20 22:31:39.810380 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.810334 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/59e313b6-e3c4-4c27-a6bd-511706f74ec9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn"
Apr 20 22:31:39.810380 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.810379 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\"
(UniqueName: \"kubernetes.io/configmap/59e313b6-e3c4-4c27-a6bd-511706f74ec9-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.810621 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.810425 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57pf5\" (UniqueName: \"kubernetes.io/projected/59e313b6-e3c4-4c27-a6bd-511706f74ec9-kube-api-access-57pf5\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.911959 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.911914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/59e313b6-e3c4-4c27-a6bd-511706f74ec9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.911959 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.911957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59e313b6-e3c4-4c27-a6bd-511706f74ec9-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.912308 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.911984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57pf5\" (UniqueName: \"kubernetes.io/projected/59e313b6-e3c4-4c27-a6bd-511706f74ec9-kube-api-access-57pf5\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.912668 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.912642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/59e313b6-e3c4-4c27-a6bd-511706f74ec9-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.914578 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.914557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/59e313b6-e3c4-4c27-a6bd-511706f74ec9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.922798 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.922736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57pf5\" (UniqueName: \"kubernetes.io/projected/59e313b6-e3c4-4c27-a6bd-511706f74ec9-kube-api-access-57pf5\") pod \"kuadrant-console-plugin-6cb54b5c86-vzjtn\" (UID: \"59e313b6-e3c4-4c27-a6bd-511706f74ec9\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:39.986806 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:39.986770 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" Apr 20 22:31:40.139001 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:40.138965 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn"] Apr 20 22:31:40.145022 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:31:40.142902 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59e313b6_e3c4_4c27_a6bd_511706f74ec9.slice/crio-043873d64b21c4cc3625196669e4b527cf4cd6d585c09d339cb071335944753e WatchSource:0}: Error finding container 043873d64b21c4cc3625196669e4b527cf4cd6d585c09d339cb071335944753e: Status 404 returned error can't find the container with id 043873d64b21c4cc3625196669e4b527cf4cd6d585c09d339cb071335944753e Apr 20 22:31:40.549641 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:31:40.549607 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" event={"ID":"59e313b6-e3c4-4c27-a6bd-511706f74ec9","Type":"ContainerStarted","Data":"043873d64b21c4cc3625196669e4b527cf4cd6d585c09d339cb071335944753e"} Apr 20 22:32:05.647374 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:05.647330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" event={"ID":"59e313b6-e3c4-4c27-a6bd-511706f74ec9","Type":"ContainerStarted","Data":"97205e2fe92f8aa0ef835551aff2214290c6a3c32c992035367c3ea430226c84"} Apr 20 22:32:05.666266 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:05.666204 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vzjtn" podStartSLOduration=1.6473581149999998 podStartE2EDuration="26.666190238s" podCreationTimestamp="2026-04-20 22:31:39 +0000 UTC" firstStartedPulling="2026-04-20 22:31:40.146228424 +0000 UTC m=+427.819416087" 
lastFinishedPulling="2026-04-20 22:32:05.165060537 +0000 UTC m=+452.838248210" observedRunningTime="2026-04-20 22:32:05.664409451 +0000 UTC m=+453.337597180" watchObservedRunningTime="2026-04-20 22:32:05.666190238 +0000 UTC m=+453.339377922" Apr 20 22:32:25.152138 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.152093 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"] Apr 20 22:32:25.180539 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.180509 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"] Apr 20 22:32:25.180699 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.180635 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.183544 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.183515 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 22:32:25.242603 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.242551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h69ws\" (UniqueName: \"kubernetes.io/projected/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-kube-api-access-h69ws\") pod \"limitador-limitador-7d549b5b-wm8c2\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.242793 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.242674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-config-file\") pod \"limitador-limitador-7d549b5b-wm8c2\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.251926 
ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.251889 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"] Apr 20 22:32:25.343538 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.343504 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-config-file\") pod \"limitador-limitador-7d549b5b-wm8c2\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.343717 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.343588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h69ws\" (UniqueName: \"kubernetes.io/projected/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-kube-api-access-h69ws\") pod \"limitador-limitador-7d549b5b-wm8c2\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.344172 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.344127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-config-file\") pod \"limitador-limitador-7d549b5b-wm8c2\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.352311 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.352287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h69ws\" (UniqueName: \"kubernetes.io/projected/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-kube-api-access-h69ws\") pod \"limitador-limitador-7d549b5b-wm8c2\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.492748 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.492705 2568 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:25.619595 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.619568 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"] Apr 20 22:32:25.621652 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:32:25.621621 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a7f7ed8_4433_4781_aff1_2792a4ef8d12.slice/crio-f70e6ab51e5e125e30f8da72247fe601baa190e17612103161bd5936701978ca WatchSource:0}: Error finding container f70e6ab51e5e125e30f8da72247fe601baa190e17612103161bd5936701978ca: Status 404 returned error can't find the container with id f70e6ab51e5e125e30f8da72247fe601baa190e17612103161bd5936701978ca Apr 20 22:32:25.715875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.715839 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" event={"ID":"5a7f7ed8-4433-4781-aff1-2792a4ef8d12","Type":"ContainerStarted","Data":"f70e6ab51e5e125e30f8da72247fe601baa190e17612103161bd5936701978ca"} Apr 20 22:32:25.962069 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:25.962034 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xvj9c"] Apr 20 22:32:26.036173 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.036108 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xvj9c"] Apr 20 22:32:26.036364 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.036281 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:26.039214 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.039190 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k66vw\"" Apr 20 22:32:26.151128 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.151094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wjg\" (UniqueName: \"kubernetes.io/projected/be8d2fc4-10c9-48a6-afed-faa8c199613f-kube-api-access-k5wjg\") pod \"authorino-f99f4b5cd-xvj9c\" (UID: \"be8d2fc4-10c9-48a6-afed-faa8c199613f\") " pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:26.252728 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.252642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wjg\" (UniqueName: \"kubernetes.io/projected/be8d2fc4-10c9-48a6-afed-faa8c199613f-kube-api-access-k5wjg\") pod \"authorino-f99f4b5cd-xvj9c\" (UID: \"be8d2fc4-10c9-48a6-afed-faa8c199613f\") " pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:26.261730 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.261679 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wjg\" (UniqueName: \"kubernetes.io/projected/be8d2fc4-10c9-48a6-afed-faa8c199613f-kube-api-access-k5wjg\") pod \"authorino-f99f4b5cd-xvj9c\" (UID: \"be8d2fc4-10c9-48a6-afed-faa8c199613f\") " pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:26.346110 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.346073 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:26.517366 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.517331 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xvj9c"] Apr 20 22:32:26.520713 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:32:26.520678 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe8d2fc4_10c9_48a6_afed_faa8c199613f.slice/crio-0b059c3719409b48571fe6b038550f28395e8b610c8575ceb2d47f903063739f WatchSource:0}: Error finding container 0b059c3719409b48571fe6b038550f28395e8b610c8575ceb2d47f903063739f: Status 404 returned error can't find the container with id 0b059c3719409b48571fe6b038550f28395e8b610c8575ceb2d47f903063739f Apr 20 22:32:26.721340 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:26.721288 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" event={"ID":"be8d2fc4-10c9-48a6-afed-faa8c199613f","Type":"ContainerStarted","Data":"0b059c3719409b48571fe6b038550f28395e8b610c8575ceb2d47f903063739f"} Apr 20 22:32:29.734762 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:29.734720 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" event={"ID":"5a7f7ed8-4433-4781-aff1-2792a4ef8d12","Type":"ContainerStarted","Data":"a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0"} Apr 20 22:32:29.735213 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:29.734853 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:29.753062 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:29.753002 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" podStartSLOduration=1.038478863 podStartE2EDuration="4.75298081s" 
podCreationTimestamp="2026-04-20 22:32:25 +0000 UTC" firstStartedPulling="2026-04-20 22:32:25.623569837 +0000 UTC m=+473.296757503" lastFinishedPulling="2026-04-20 22:32:29.338071775 +0000 UTC m=+477.011259450" observedRunningTime="2026-04-20 22:32:29.752649775 +0000 UTC m=+477.425837462" watchObservedRunningTime="2026-04-20 22:32:29.75298081 +0000 UTC m=+477.426168497" Apr 20 22:32:30.793707 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:30.793664 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xvj9c"] Apr 20 22:32:31.742360 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:31.742322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" event={"ID":"be8d2fc4-10c9-48a6-afed-faa8c199613f","Type":"ContainerStarted","Data":"bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41"} Apr 20 22:32:31.742541 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:31.742409 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" podUID="be8d2fc4-10c9-48a6-afed-faa8c199613f" containerName="authorino" containerID="cri-o://bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41" gracePeriod=30 Apr 20 22:32:31.757285 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:31.757239 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" podStartSLOduration=2.193286406 podStartE2EDuration="6.757224184s" podCreationTimestamp="2026-04-20 22:32:25 +0000 UTC" firstStartedPulling="2026-04-20 22:32:26.523784417 +0000 UTC m=+474.196972085" lastFinishedPulling="2026-04-20 22:32:31.087722192 +0000 UTC m=+478.760909863" observedRunningTime="2026-04-20 22:32:31.75557999 +0000 UTC m=+479.428767669" watchObservedRunningTime="2026-04-20 22:32:31.757224184 +0000 UTC m=+479.430411913" Apr 20 22:32:31.973936 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:31.973910 
2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:32.116070 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.115978 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wjg\" (UniqueName: \"kubernetes.io/projected/be8d2fc4-10c9-48a6-afed-faa8c199613f-kube-api-access-k5wjg\") pod \"be8d2fc4-10c9-48a6-afed-faa8c199613f\" (UID: \"be8d2fc4-10c9-48a6-afed-faa8c199613f\") " Apr 20 22:32:32.118279 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.118248 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be8d2fc4-10c9-48a6-afed-faa8c199613f-kube-api-access-k5wjg" (OuterVolumeSpecName: "kube-api-access-k5wjg") pod "be8d2fc4-10c9-48a6-afed-faa8c199613f" (UID: "be8d2fc4-10c9-48a6-afed-faa8c199613f"). InnerVolumeSpecName "kube-api-access-k5wjg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:32:32.217360 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.217317 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5wjg\" (UniqueName: \"kubernetes.io/projected/be8d2fc4-10c9-48a6-afed-faa8c199613f-kube-api-access-k5wjg\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:32:32.747397 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.747360 2568 generic.go:358] "Generic (PLEG): container finished" podID="be8d2fc4-10c9-48a6-afed-faa8c199613f" containerID="bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41" exitCode=0 Apr 20 22:32:32.747566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.747407 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" Apr 20 22:32:32.747566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.747449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" event={"ID":"be8d2fc4-10c9-48a6-afed-faa8c199613f","Type":"ContainerDied","Data":"bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41"} Apr 20 22:32:32.747566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.747486 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-xvj9c" event={"ID":"be8d2fc4-10c9-48a6-afed-faa8c199613f","Type":"ContainerDied","Data":"0b059c3719409b48571fe6b038550f28395e8b610c8575ceb2d47f903063739f"} Apr 20 22:32:32.747566 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.747501 2568 scope.go:117] "RemoveContainer" containerID="bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41" Apr 20 22:32:32.755823 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.755802 2568 scope.go:117] "RemoveContainer" containerID="bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41" Apr 20 22:32:32.756122 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:32:32.756093 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41\": container with ID starting with bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41 not found: ID does not exist" containerID="bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41" Apr 20 22:32:32.756200 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.756129 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41"} err="failed to get container status \"bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41\": rpc error: code = 
NotFound desc = could not find container \"bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41\": container with ID starting with bffde8f268367cf9cdaf6a82535966ed87253c4448cbb0923bea6f40bee42f41 not found: ID does not exist" Apr 20 22:32:32.767296 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.767263 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xvj9c"] Apr 20 22:32:32.770862 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.770835 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-xvj9c"] Apr 20 22:32:32.913171 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:32.913111 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be8d2fc4-10c9-48a6-afed-faa8c199613f" path="/var/lib/kubelet/pods/be8d2fc4-10c9-48a6-afed-faa8c199613f/volumes" Apr 20 22:32:39.792517 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:39.792477 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"] Apr 20 22:32:39.792977 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:39.792721 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" podUID="5a7f7ed8-4433-4781-aff1-2792a4ef8d12" containerName="limitador" containerID="cri-o://a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0" gracePeriod=30 Apr 20 22:32:39.793429 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:39.793411 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:40.362181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.362095 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" Apr 20 22:32:40.386797 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.386767 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h69ws\" (UniqueName: \"kubernetes.io/projected/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-kube-api-access-h69ws\") pod \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " Apr 20 22:32:40.386797 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.386805 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-config-file\") pod \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\" (UID: \"5a7f7ed8-4433-4781-aff1-2792a4ef8d12\") " Apr 20 22:32:40.387211 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.387190 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-config-file" (OuterVolumeSpecName: "config-file") pod "5a7f7ed8-4433-4781-aff1-2792a4ef8d12" (UID: "5a7f7ed8-4433-4781-aff1-2792a4ef8d12"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:32:40.388885 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.388861 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-kube-api-access-h69ws" (OuterVolumeSpecName: "kube-api-access-h69ws") pod "5a7f7ed8-4433-4781-aff1-2792a4ef8d12" (UID: "5a7f7ed8-4433-4781-aff1-2792a4ef8d12"). InnerVolumeSpecName "kube-api-access-h69ws". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:32:40.487408 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.487369 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h69ws\" (UniqueName: \"kubernetes.io/projected/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-kube-api-access-h69ws\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:32:40.487408 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.487402 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5a7f7ed8-4433-4781-aff1-2792a4ef8d12-config-file\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\"" Apr 20 22:32:40.774873 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.774829 2568 generic.go:358] "Generic (PLEG): container finished" podID="5a7f7ed8-4433-4781-aff1-2792a4ef8d12" containerID="a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0" exitCode=0 Apr 20 22:32:40.775044 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.774890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" event={"ID":"5a7f7ed8-4433-4781-aff1-2792a4ef8d12","Type":"ContainerDied","Data":"a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0"} Apr 20 22:32:40.775044 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.774909 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2"
Apr 20 22:32:40.775044 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.774922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wm8c2" event={"ID":"5a7f7ed8-4433-4781-aff1-2792a4ef8d12","Type":"ContainerDied","Data":"f70e6ab51e5e125e30f8da72247fe601baa190e17612103161bd5936701978ca"}
Apr 20 22:32:40.775044 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.774942 2568 scope.go:117] "RemoveContainer" containerID="a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0"
Apr 20 22:32:40.783256 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.783232 2568 scope.go:117] "RemoveContainer" containerID="a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0"
Apr 20 22:32:40.783559 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:32:40.783536 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0\": container with ID starting with a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0 not found: ID does not exist" containerID="a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0"
Apr 20 22:32:40.783609 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.783564 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0"} err="failed to get container status \"a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0\": rpc error: code = NotFound desc = could not find container \"a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0\": container with ID starting with a218d611a1b863d6f35b9a6684b11cbbec2851d9397a39fddd96e52c6e700be0 not found: ID does not exist"
Apr 20 22:32:40.795105 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.795066 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"]
Apr 20 22:32:40.800132 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.800092 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wm8c2"]
Apr 20 22:32:40.913425 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:40.913393 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a7f7ed8-4433-4781-aff1-2792a4ef8d12" path="/var/lib/kubelet/pods/5a7f7ed8-4433-4781-aff1-2792a4ef8d12/volumes"
Apr 20 22:32:41.207852 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.207816 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-ghhph"]
Apr 20 22:32:41.208182 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.208166 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="be8d2fc4-10c9-48a6-afed-faa8c199613f" containerName="authorino"
Apr 20 22:32:41.208242 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.208187 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="be8d2fc4-10c9-48a6-afed-faa8c199613f" containerName="authorino"
Apr 20 22:32:41.208242 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.208207 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a7f7ed8-4433-4781-aff1-2792a4ef8d12" containerName="limitador"
Apr 20 22:32:41.208242 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.208217 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a7f7ed8-4433-4781-aff1-2792a4ef8d12" containerName="limitador"
Apr 20 22:32:41.208338 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.208312 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="be8d2fc4-10c9-48a6-afed-faa8c199613f" containerName="authorino"
Apr 20 22:32:41.208338 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.208329 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a7f7ed8-4433-4781-aff1-2792a4ef8d12" containerName="limitador"
Apr 20 22:32:41.216705 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.216246 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.219104 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.219073 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-42r9d\""
Apr 20 22:32:41.219303 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.219089 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 20 22:32:41.219652 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.219626 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-ghhph"]
Apr 20 22:32:41.292705 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.292663 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/54575915-5205-4036-a330-79b768a5073f-data\") pod \"postgres-868db5846d-ghhph\" (UID: \"54575915-5205-4036-a330-79b768a5073f\") " pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.292705 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.292705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjxt\" (UniqueName: \"kubernetes.io/projected/54575915-5205-4036-a330-79b768a5073f-kube-api-access-jqjxt\") pod \"postgres-868db5846d-ghhph\" (UID: \"54575915-5205-4036-a330-79b768a5073f\") " pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.393930 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.393888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/54575915-5205-4036-a330-79b768a5073f-data\") pod \"postgres-868db5846d-ghhph\" (UID: \"54575915-5205-4036-a330-79b768a5073f\") " pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.393930 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.393931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjxt\" (UniqueName: \"kubernetes.io/projected/54575915-5205-4036-a330-79b768a5073f-kube-api-access-jqjxt\") pod \"postgres-868db5846d-ghhph\" (UID: \"54575915-5205-4036-a330-79b768a5073f\") " pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.394350 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.394329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/54575915-5205-4036-a330-79b768a5073f-data\") pod \"postgres-868db5846d-ghhph\" (UID: \"54575915-5205-4036-a330-79b768a5073f\") " pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.402409 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.402377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjxt\" (UniqueName: \"kubernetes.io/projected/54575915-5205-4036-a330-79b768a5073f-kube-api-access-jqjxt\") pod \"postgres-868db5846d-ghhph\" (UID: \"54575915-5205-4036-a330-79b768a5073f\") " pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.529466 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.529365 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:41.659765 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.659740 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-ghhph"]
Apr 20 22:32:41.662577 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:32:41.662547 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54575915_5205_4036_a330_79b768a5073f.slice/crio-40c7bd2e85e00cd91615bfd911a3e4c4f8e11f9ba682037def1f48fc11e2c83a WatchSource:0}: Error finding container 40c7bd2e85e00cd91615bfd911a3e4c4f8e11f9ba682037def1f48fc11e2c83a: Status 404 returned error can't find the container with id 40c7bd2e85e00cd91615bfd911a3e4c4f8e11f9ba682037def1f48fc11e2c83a
Apr 20 22:32:41.779231 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:41.779191 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-ghhph" event={"ID":"54575915-5205-4036-a330-79b768a5073f","Type":"ContainerStarted","Data":"40c7bd2e85e00cd91615bfd911a3e4c4f8e11f9ba682037def1f48fc11e2c83a"}
Apr 20 22:32:46.800648 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:46.800605 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-ghhph" event={"ID":"54575915-5205-4036-a330-79b768a5073f","Type":"ContainerStarted","Data":"1e6d21b41cac96659d970ef6e20dd6e408cab54acf45b19317d61375aeb8c105"}
Apr 20 22:32:46.801036 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:46.800669 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:32:46.819344 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:46.819289 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-ghhph" podStartSLOduration=0.878106444 podStartE2EDuration="5.819271443s" podCreationTimestamp="2026-04-20 22:32:41 +0000 UTC" firstStartedPulling="2026-04-20 22:32:41.663896761 +0000 UTC m=+489.337084427" lastFinishedPulling="2026-04-20 22:32:46.605061753 +0000 UTC m=+494.278249426" observedRunningTime="2026-04-20 22:32:46.818128065 +0000 UTC m=+494.491315751" watchObservedRunningTime="2026-04-20 22:32:46.819271443 +0000 UTC m=+494.492459198"
Apr 20 22:32:52.836734 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:32:52.836699 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-ghhph"
Apr 20 22:33:01.947268 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:01.947234 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-22h9d"]
Apr 20 22:33:01.951967 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:01.951940 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d"
Apr 20 22:33:01.955145 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:01.955098 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 20 22:33:01.956369 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:01.956348 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-j9mzm\""
Apr 20 22:33:01.956510 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:01.956348 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 20 22:33:01.976494 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:01.976447 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-22h9d"]
Apr 20 22:33:02.067769 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:02.067728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wffnw\" (UniqueName: \"kubernetes.io/projected/f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458-kube-api-access-wffnw\") pod \"keycloak-operator-5c4df598dd-22h9d\" (UID: \"f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458\") " pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d"
Apr 20 22:33:02.169348 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:02.169304 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wffnw\" (UniqueName: \"kubernetes.io/projected/f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458-kube-api-access-wffnw\") pod \"keycloak-operator-5c4df598dd-22h9d\" (UID: \"f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458\") " pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d"
Apr 20 22:33:02.190107 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:02.190074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wffnw\" (UniqueName: \"kubernetes.io/projected/f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458-kube-api-access-wffnw\") pod \"keycloak-operator-5c4df598dd-22h9d\" (UID: \"f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458\") " pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d"
Apr 20 22:33:02.263722 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:02.263625 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d"
Apr 20 22:33:02.395795 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:02.395759 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-22h9d"]
Apr 20 22:33:02.399773 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:33:02.399739 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf80c5dfe_1bc7_4f4b_8fe2_d4975a1c5458.slice/crio-e7e0c6f0439dc5bf8adcb2f03a7e0b500c7d60caa67e712f2f517cf782d55e33 WatchSource:0}: Error finding container e7e0c6f0439dc5bf8adcb2f03a7e0b500c7d60caa67e712f2f517cf782d55e33: Status 404 returned error can't find the container with id e7e0c6f0439dc5bf8adcb2f03a7e0b500c7d60caa67e712f2f517cf782d55e33
Apr 20 22:33:02.857137 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:02.857100 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d" event={"ID":"f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458","Type":"ContainerStarted","Data":"e7e0c6f0439dc5bf8adcb2f03a7e0b500c7d60caa67e712f2f517cf782d55e33"}
Apr 20 22:33:08.880025 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:08.879985 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d" event={"ID":"f80c5dfe-1bc7-4f4b-8fe2-d4975a1c5458","Type":"ContainerStarted","Data":"754de439cbf9f0b8a774aab9ea23e3f045a1a9957f7faa592fb5805783b772b4"}
Apr 20 22:33:08.900699 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:08.900647 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-22h9d" podStartSLOduration=2.385812432 podStartE2EDuration="7.90063268s" podCreationTimestamp="2026-04-20 22:33:01 +0000 UTC" firstStartedPulling="2026-04-20 22:33:02.40102181 +0000 UTC m=+510.074209476" lastFinishedPulling="2026-04-20 22:33:07.915842039 +0000 UTC m=+515.589029724" observedRunningTime="2026-04-20 22:33:08.899450489 +0000 UTC m=+516.572638176" watchObservedRunningTime="2026-04-20 22:33:08.90063268 +0000 UTC m=+516.573820364"
Apr 20 22:33:48.203929 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.203891 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4ltpm"]
Apr 20 22:33:48.211309 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.211286 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:48.214402 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.214372 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k66vw\""
Apr 20 22:33:48.215395 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.215368 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4ltpm"]
Apr 20 22:33:48.273998 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.273962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zhxc\" (UniqueName: \"kubernetes.io/projected/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad-kube-api-access-8zhxc\") pod \"authorino-8b475cf9f-4ltpm\" (UID: \"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad\") " pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:48.374605 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.374561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zhxc\" (UniqueName: \"kubernetes.io/projected/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad-kube-api-access-8zhxc\") pod \"authorino-8b475cf9f-4ltpm\" (UID: \"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad\") " pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:48.382909 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.382870 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zhxc\" (UniqueName: \"kubernetes.io/projected/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad-kube-api-access-8zhxc\") pod \"authorino-8b475cf9f-4ltpm\" (UID: \"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad\") " pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:48.445929 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.445888 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4ltpm"]
Apr 20 22:33:48.446167 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.446140 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:48.474798 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.474762 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5ff479985b-87tvd"]
Apr 20 22:33:48.477673 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.477649 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:48.482886 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.482854 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5ff479985b-87tvd"]
Apr 20 22:33:48.572894 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.572869 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4ltpm"]
Apr 20 22:33:48.575420 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:33:48.575381 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e53d2d3_3b34_4ee0_85c1_56a8a870a7ad.slice/crio-7c49a78ab1bad33bff641d26c10d2fb17d3fd5531c9e2bd2350b5e60a6d30c8a WatchSource:0}: Error finding container 7c49a78ab1bad33bff641d26c10d2fb17d3fd5531c9e2bd2350b5e60a6d30c8a: Status 404 returned error can't find the container with id 7c49a78ab1bad33bff641d26c10d2fb17d3fd5531c9e2bd2350b5e60a6d30c8a
Apr 20 22:33:48.575570 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.575543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prcdt\" (UniqueName: \"kubernetes.io/projected/1854d165-b822-4212-acf5-74e28c86b98d-kube-api-access-prcdt\") pod \"authorino-5ff479985b-87tvd\" (UID: \"1854d165-b822-4212-acf5-74e28c86b98d\") " pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:48.676966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.676923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prcdt\" (UniqueName: \"kubernetes.io/projected/1854d165-b822-4212-acf5-74e28c86b98d-kube-api-access-prcdt\") pod \"authorino-5ff479985b-87tvd\" (UID: \"1854d165-b822-4212-acf5-74e28c86b98d\") " pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:48.686188 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.686144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prcdt\" (UniqueName: \"kubernetes.io/projected/1854d165-b822-4212-acf5-74e28c86b98d-kube-api-access-prcdt\") pod \"authorino-5ff479985b-87tvd\" (UID: \"1854d165-b822-4212-acf5-74e28c86b98d\") " pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:48.707060 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.706970 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5ff479985b-87tvd"]
Apr 20 22:33:48.707250 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.707238 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:48.736166 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.736121 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6b78b7845b-n7cws"]
Apr 20 22:33:48.739866 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.739834 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:48.742472 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.742443 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 22:33:48.749140 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.749114 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6b78b7845b-n7cws"]
Apr 20 22:33:48.777859 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.777818 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7145360d-78e1-418e-89f8-0af26b1cb9a0-tls-cert\") pod \"authorino-6b78b7845b-n7cws\" (UID: \"7145360d-78e1-418e-89f8-0af26b1cb9a0\") " pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:48.778036 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.778002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7d55\" (UniqueName: \"kubernetes.io/projected/7145360d-78e1-418e-89f8-0af26b1cb9a0-kube-api-access-d7d55\") pod \"authorino-6b78b7845b-n7cws\" (UID: \"7145360d-78e1-418e-89f8-0af26b1cb9a0\") " pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:48.834680 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.834652 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5ff479985b-87tvd"]
Apr 20 22:33:48.837178 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:33:48.837136 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1854d165_b822_4212_acf5_74e28c86b98d.slice/crio-f5322de1730664d0ea72e313d19e9d8d527d27859de3889f0f1a5656ec740d4a WatchSource:0}: Error finding container f5322de1730664d0ea72e313d19e9d8d527d27859de3889f0f1a5656ec740d4a: Status 404 returned error can't find the container with id f5322de1730664d0ea72e313d19e9d8d527d27859de3889f0f1a5656ec740d4a
Apr 20 22:33:48.879028 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.878990 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7d55\" (UniqueName: \"kubernetes.io/projected/7145360d-78e1-418e-89f8-0af26b1cb9a0-kube-api-access-d7d55\") pod \"authorino-6b78b7845b-n7cws\" (UID: \"7145360d-78e1-418e-89f8-0af26b1cb9a0\") " pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:48.879231 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.879073 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7145360d-78e1-418e-89f8-0af26b1cb9a0-tls-cert\") pod \"authorino-6b78b7845b-n7cws\" (UID: \"7145360d-78e1-418e-89f8-0af26b1cb9a0\") " pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:48.881636 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.881617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7145360d-78e1-418e-89f8-0af26b1cb9a0-tls-cert\") pod \"authorino-6b78b7845b-n7cws\" (UID: \"7145360d-78e1-418e-89f8-0af26b1cb9a0\") " pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:48.887357 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:48.887336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7d55\" (UniqueName: \"kubernetes.io/projected/7145360d-78e1-418e-89f8-0af26b1cb9a0-kube-api-access-d7d55\") pod \"authorino-6b78b7845b-n7cws\" (UID: \"7145360d-78e1-418e-89f8-0af26b1cb9a0\") " pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:49.011444 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.011306 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4ltpm" event={"ID":"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad","Type":"ContainerStarted","Data":"6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1"}
Apr 20 22:33:49.011444 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.011312 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-4ltpm" podUID="9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" containerName="authorino" containerID="cri-o://6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1" gracePeriod=30
Apr 20 22:33:49.011444 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.011353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4ltpm" event={"ID":"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad","Type":"ContainerStarted","Data":"7c49a78ab1bad33bff641d26c10d2fb17d3fd5531c9e2bd2350b5e60a6d30c8a"}
Apr 20 22:33:49.012534 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.012494 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5ff479985b-87tvd" event={"ID":"1854d165-b822-4212-acf5-74e28c86b98d","Type":"ContainerStarted","Data":"f5322de1730664d0ea72e313d19e9d8d527d27859de3889f0f1a5656ec740d4a"}
Apr 20 22:33:49.026560 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.026515 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-4ltpm" podStartSLOduration=0.695568855 podStartE2EDuration="1.026499473s" podCreationTimestamp="2026-04-20 22:33:48 +0000 UTC" firstStartedPulling="2026-04-20 22:33:48.576799853 +0000 UTC m=+556.249987520" lastFinishedPulling="2026-04-20 22:33:48.907730467 +0000 UTC m=+556.580918138" observedRunningTime="2026-04-20 22:33:49.025266887 +0000 UTC m=+556.698454573" watchObservedRunningTime="2026-04-20 22:33:49.026499473 +0000 UTC m=+556.699687152"
Apr 20 22:33:49.052490 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.052452 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6b78b7845b-n7cws"
Apr 20 22:33:49.183109 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.182432 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6b78b7845b-n7cws"]
Apr 20 22:33:49.186626 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:33:49.186570 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7145360d_78e1_418e_89f8_0af26b1cb9a0.slice/crio-f30577eace97eb36a30d18064d4c8f70966f04a60c6807714874aff232fd3e00 WatchSource:0}: Error finding container f30577eace97eb36a30d18064d4c8f70966f04a60c6807714874aff232fd3e00: Status 404 returned error can't find the container with id f30577eace97eb36a30d18064d4c8f70966f04a60c6807714874aff232fd3e00
Apr 20 22:33:49.253633 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.253608 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:49.283990 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.283888 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zhxc\" (UniqueName: \"kubernetes.io/projected/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad-kube-api-access-8zhxc\") pod \"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad\" (UID: \"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad\") "
Apr 20 22:33:49.286786 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.286744 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad-kube-api-access-8zhxc" (OuterVolumeSpecName: "kube-api-access-8zhxc") pod "9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" (UID: "9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad"). InnerVolumeSpecName "kube-api-access-8zhxc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:33:49.385446 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:49.385404 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zhxc\" (UniqueName: \"kubernetes.io/projected/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad-kube-api-access-8zhxc\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\""
Apr 20 22:33:50.017562 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.017518 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5ff479985b-87tvd" event={"ID":"1854d165-b822-4212-acf5-74e28c86b98d","Type":"ContainerStarted","Data":"02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095"}
Apr 20 22:33:50.017562 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.017517 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5ff479985b-87tvd" podUID="1854d165-b822-4212-acf5-74e28c86b98d" containerName="authorino" containerID="cri-o://02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095" gracePeriod=30
Apr 20 22:33:50.018611 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.018581 2568 generic.go:358] "Generic (PLEG): container finished" podID="9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" containerID="6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1" exitCode=0
Apr 20 22:33:50.018751 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.018626 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4ltpm"
Apr 20 22:33:50.018751 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.018660 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4ltpm" event={"ID":"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad","Type":"ContainerDied","Data":"6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1"}
Apr 20 22:33:50.018751 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.018698 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4ltpm" event={"ID":"9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad","Type":"ContainerDied","Data":"7c49a78ab1bad33bff641d26c10d2fb17d3fd5531c9e2bd2350b5e60a6d30c8a"}
Apr 20 22:33:50.018751 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.018719 2568 scope.go:117] "RemoveContainer" containerID="6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1"
Apr 20 22:33:50.020029 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.020008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6b78b7845b-n7cws" event={"ID":"7145360d-78e1-418e-89f8-0af26b1cb9a0","Type":"ContainerStarted","Data":"d387ba4b6501803d4e815625e43207ac566a69b4d2957e4c8f9115added68fbf"}
Apr 20 22:33:50.020171 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.020034 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6b78b7845b-n7cws" event={"ID":"7145360d-78e1-418e-89f8-0af26b1cb9a0","Type":"ContainerStarted","Data":"f30577eace97eb36a30d18064d4c8f70966f04a60c6807714874aff232fd3e00"}
Apr 20 22:33:50.027902 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.027880 2568 scope.go:117] "RemoveContainer" containerID="6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1"
Apr 20 22:33:50.028231 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:33:50.028213 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1\": container with ID starting with 6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1 not found: ID does not exist" containerID="6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1"
Apr 20 22:33:50.028305 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.028239 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1"} err="failed to get container status \"6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1\": rpc error: code = NotFound desc = could not find container \"6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1\": container with ID starting with 6d07a2f87bbf65522d96038f14a0b8efd91891a51586790cc6fdc6c6320540d1 not found: ID does not exist"
Apr 20 22:33:50.033118 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.033078 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5ff479985b-87tvd" podStartSLOduration=1.7016004009999999 podStartE2EDuration="2.033066438s" podCreationTimestamp="2026-04-20 22:33:48 +0000 UTC" firstStartedPulling="2026-04-20 22:33:48.838579452 +0000 UTC m=+556.511767118" lastFinishedPulling="2026-04-20 22:33:49.170045479 +0000 UTC m=+556.843233155" observedRunningTime="2026-04-20 22:33:50.03205054 +0000 UTC m=+557.705238226" watchObservedRunningTime="2026-04-20 22:33:50.033066438 +0000 UTC m=+557.706254169"
Apr 20 22:33:50.054106 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.054057 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6b78b7845b-n7cws" podStartSLOduration=1.755480103 podStartE2EDuration="2.054041404s" podCreationTimestamp="2026-04-20 22:33:48 +0000 UTC" firstStartedPulling="2026-04-20 22:33:49.188566098 +0000 UTC m=+556.861753763" lastFinishedPulling="2026-04-20 22:33:49.487127398 +0000 UTC m=+557.160315064" observedRunningTime="2026-04-20 22:33:50.052656511 +0000 UTC m=+557.725844196" watchObservedRunningTime="2026-04-20 22:33:50.054041404 +0000 UTC m=+557.727229111"
Apr 20 22:33:50.091145 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.091099 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4ltpm"]
Apr 20 22:33:50.096441 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.096414 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4ltpm"]
Apr 20 22:33:50.251430 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.251403 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:50.293761 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.293675 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prcdt\" (UniqueName: \"kubernetes.io/projected/1854d165-b822-4212-acf5-74e28c86b98d-kube-api-access-prcdt\") pod \"1854d165-b822-4212-acf5-74e28c86b98d\" (UID: \"1854d165-b822-4212-acf5-74e28c86b98d\") "
Apr 20 22:33:50.295750 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.295725 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1854d165-b822-4212-acf5-74e28c86b98d-kube-api-access-prcdt" (OuterVolumeSpecName: "kube-api-access-prcdt") pod "1854d165-b822-4212-acf5-74e28c86b98d" (UID: "1854d165-b822-4212-acf5-74e28c86b98d"). InnerVolumeSpecName "kube-api-access-prcdt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:33:50.395162 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.395118 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prcdt\" (UniqueName: \"kubernetes.io/projected/1854d165-b822-4212-acf5-74e28c86b98d-kube-api-access-prcdt\") on node \"ip-10-0-133-201.ec2.internal\" DevicePath \"\""
Apr 20 22:33:50.914018 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:50.913981 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" path="/var/lib/kubelet/pods/9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad/volumes"
Apr 20 22:33:51.025554 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.025512 2568 generic.go:358] "Generic (PLEG): container finished" podID="1854d165-b822-4212-acf5-74e28c86b98d" containerID="02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095" exitCode=0
Apr 20 22:33:51.025744 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.025603 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5ff479985b-87tvd"
Apr 20 22:33:51.025744 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.025642 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5ff479985b-87tvd" event={"ID":"1854d165-b822-4212-acf5-74e28c86b98d","Type":"ContainerDied","Data":"02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095"}
Apr 20 22:33:51.025744 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.025668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5ff479985b-87tvd" event={"ID":"1854d165-b822-4212-acf5-74e28c86b98d","Type":"ContainerDied","Data":"f5322de1730664d0ea72e313d19e9d8d527d27859de3889f0f1a5656ec740d4a"}
Apr 20 22:33:51.025744 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.025687 2568 scope.go:117] "RemoveContainer" containerID="02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095"
Apr 20 22:33:51.034371 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.034346 2568 scope.go:117] "RemoveContainer" containerID="02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095"
Apr 20 22:33:51.034672 ip-10-0-133-201 kubenswrapper[2568]: E0420 22:33:51.034636 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095\": container with ID starting with 02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095 not found: ID does not exist" containerID="02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095"
Apr 20 22:33:51.034769 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.034679 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095"} err="failed to get container status \"02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095\": rpc error: code = NotFound desc = could not find container \"02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095\": container with ID starting with 02cb89d773e5d44e79c693fd75afcf3c70eb9240e947426c0e28d2b81f2da095 not found: ID does not exist"
Apr 20 22:33:51.043575 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.043545 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5ff479985b-87tvd"]
Apr 20 22:33:51.046973 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:51.046943 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5ff479985b-87tvd"]
Apr 20 22:33:52.913322 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:33:52.913287 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1854d165-b822-4212-acf5-74e28c86b98d" path="/var/lib/kubelet/pods/1854d165-b822-4212-acf5-74e28c86b98d/volumes"
Apr 20 22:34:20.351363 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:20.351330 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6b78b7845b-n7cws_7145360d-78e1-418e-89f8-0af26b1cb9a0/authorino/0.log"
Apr 20 22:34:23.772677 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:23.772629 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d8d569d47-t9tvx_de45cef1-0c87-4ce3-958b-7bc29edea051/manager/0.log"
Apr 20 22:34:24.019432 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:24.019384 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-ghhph_54575915-5205-4036-a330-79b768a5073f/postgres/0.log"
Apr 20 22:34:25.279326 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:25.279287 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6b78b7845b-n7cws_7145360d-78e1-418e-89f8-0af26b1cb9a0/authorino/0.log"
Apr 20 22:34:25.612883 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:25.612802 2568 log.go:25] "Finished parsing log file"
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-vzjtn_59e313b6-e3c4-4c27-a6bd-511706f74ec9/kuadrant-console-plugin/0.log" Apr 20 22:34:26.681653 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:26.681621 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7755c94fdf-gtfmk_2fd070d1-dbe6-4220-a085-8debaefaff0d/kube-auth-proxy/0.log" Apr 20 22:34:34.124746 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:34.124712 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m6mz4_2be881fb-1970-4504-9584-45d4d886c5a9/global-pull-secret-syncer/0.log" Apr 20 22:34:34.239895 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:34.239852 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t89tm_d33395f3-9f3c-4562-ad4d-b1058d8551bf/konnectivity-agent/0.log" Apr 20 22:34:34.331617 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:34.331586 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-201.ec2.internal_674e8501a0d803973607282eda51f055/haproxy/0.log" Apr 20 22:34:38.582225 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:38.582197 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-6b78b7845b-n7cws_7145360d-78e1-418e-89f8-0af26b1cb9a0/authorino/0.log" Apr 20 22:34:38.660894 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:38.660863 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-vzjtn_59e313b6-e3c4-4c27-a6bd-511706f74ec9/kuadrant-console-plugin/0.log" Apr 20 22:34:40.390322 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.390236 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f2prt_bae61085-f01a-4979-8495-49df502b51b9/kube-state-metrics/0.log" Apr 20 22:34:40.410916 ip-10-0-133-201 
kubenswrapper[2568]: I0420 22:34:40.410888 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f2prt_bae61085-f01a-4979-8495-49df502b51b9/kube-rbac-proxy-main/0.log" Apr 20 22:34:40.435686 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.435654 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f2prt_bae61085-f01a-4979-8495-49df502b51b9/kube-rbac-proxy-self/0.log" Apr 20 22:34:40.470360 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.470321 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5c868c9cbc-68w4d_c2b4944e-42df-4cc7-a7fa-55aff7e04fbf/metrics-server/0.log" Apr 20 22:34:40.504413 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.504385 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-22wf7_e9da1b1c-c4fb-4597-9667-f377d36939d7/monitoring-plugin/0.log" Apr 20 22:34:40.536791 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.536766 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6jbt7_84a62217-01ac-4867-83c4-e5586c70021c/node-exporter/0.log" Apr 20 22:34:40.558875 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.558846 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6jbt7_84a62217-01ac-4867-83c4-e5586c70021c/kube-rbac-proxy/0.log" Apr 20 22:34:40.581312 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.581285 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6jbt7_84a62217-01ac-4867-83c4-e5586c70021c/init-textfile/0.log" Apr 20 22:34:40.777565 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.777535 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kxkw_5ac8a623-4817-4b58-9c3f-57dce933db29/kube-rbac-proxy-main/0.log" Apr 20 22:34:40.801638 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.801603 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kxkw_5ac8a623-4817-4b58-9c3f-57dce933db29/kube-rbac-proxy-self/0.log" Apr 20 22:34:40.823663 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.823632 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-8kxkw_5ac8a623-4817-4b58-9c3f-57dce933db29/openshift-state-metrics/0.log" Apr 20 22:34:40.856378 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.856339 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/prometheus/0.log" Apr 20 22:34:40.876861 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.876831 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/config-reloader/0.log" Apr 20 22:34:40.897782 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.897753 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/thanos-sidecar/0.log" Apr 20 22:34:40.920964 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.920932 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/kube-rbac-proxy-web/0.log" Apr 20 22:34:40.942110 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.942077 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/kube-rbac-proxy/0.log" Apr 20 22:34:40.966710 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.966679 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/kube-rbac-proxy-thanos/0.log" Apr 20 22:34:40.988198 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:40.988140 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3dd81110-a0ea-4ced-9e4c-9c8f87002448/init-config-reloader/0.log" Apr 20 22:34:41.169205 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:41.169099 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/thanos-query/0.log" Apr 20 22:34:41.189874 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:41.189839 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy-web/0.log" Apr 20 22:34:41.211943 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:41.211916 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy/0.log" Apr 20 22:34:41.232280 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:41.232250 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/prom-label-proxy/0.log" Apr 20 22:34:41.254179 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:41.254141 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy-rules/0.log" Apr 20 22:34:41.278640 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:41.278601 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-65c5db58f9-mftmf_05ee4b26-952f-4609-bd5e-75d703d80bf3/kube-rbac-proxy-metrics/0.log" Apr 20 
22:34:42.773280 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773243 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q"] Apr 20 22:34:42.773775 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773754 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1854d165-b822-4212-acf5-74e28c86b98d" containerName="authorino" Apr 20 22:34:42.773849 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773778 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1854d165-b822-4212-acf5-74e28c86b98d" containerName="authorino" Apr 20 22:34:42.773849 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773796 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" containerName="authorino" Apr 20 22:34:42.773849 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773804 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" containerName="authorino" Apr 20 22:34:42.773991 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773930 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1854d165-b822-4212-acf5-74e28c86b98d" containerName="authorino" Apr 20 22:34:42.773991 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.773945 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e53d2d3-3b34-4ee0-85c1-56a8a870a7ad" containerName="authorino" Apr 20 22:34:42.776135 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.776113 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.778593 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.778572 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5wqp7\"/\"kube-root-ca.crt\"" Apr 20 22:34:42.779745 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.779725 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5wqp7\"/\"openshift-service-ca.crt\"" Apr 20 22:34:42.779799 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.779744 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5wqp7\"/\"default-dockercfg-4qwxg\"" Apr 20 22:34:42.786506 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.786479 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q"] Apr 20 22:34:42.850181 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.850129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-lib-modules\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.850362 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.850200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-sys\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.850362 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.850236 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-proc\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.850362 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.850320 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-podres\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.850508 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.850377 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jgl\" (UniqueName: \"kubernetes.io/projected/c2b43aff-59dd-4cf0-a287-ececebd58c6f-kube-api-access-75jgl\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951744 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-lib-modules\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951744 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-sys\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " 
pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-proc\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-sys\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-podres\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75jgl\" (UniqueName: \"kubernetes.io/projected/c2b43aff-59dd-4cf0-a287-ececebd58c6f-kube-api-access-75jgl\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951902 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-proc\") 
pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951904 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-lib-modules\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.951987 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.951959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c2b43aff-59dd-4cf0-a287-ececebd58c6f-podres\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:42.959714 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:42.959691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jgl\" (UniqueName: \"kubernetes.io/projected/c2b43aff-59dd-4cf0-a287-ececebd58c6f-kube-api-access-75jgl\") pod \"perf-node-gather-daemonset-f799q\" (UID: \"c2b43aff-59dd-4cf0-a287-ececebd58c6f\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:43.086775 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:43.086680 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:43.211840 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:43.211783 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q"] Apr 20 22:34:43.214473 ip-10-0-133-201 kubenswrapper[2568]: W0420 22:34:43.214435 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc2b43aff_59dd_4cf0_a287_ececebd58c6f.slice/crio-8ee26c900d67aad9b85ae53af1bf344e9594d9f77b36c291d6e75221843ee9ce WatchSource:0}: Error finding container 8ee26c900d67aad9b85ae53af1bf344e9594d9f77b36c291d6e75221843ee9ce: Status 404 returned error can't find the container with id 8ee26c900d67aad9b85ae53af1bf344e9594d9f77b36c291d6e75221843ee9ce Apr 20 22:34:43.367532 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:43.367448 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b4468b674-s7ckc_ea13891a-9f6f-4158-b1c2-afc15b0fb327/console/0.log" Apr 20 22:34:44.219096 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.219060 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" event={"ID":"c2b43aff-59dd-4cf0-a287-ececebd58c6f","Type":"ContainerStarted","Data":"4fb551986ce434d17efb68828bda61af1d4e04ddcd6eedd34c84da0fbed6a433"} Apr 20 22:34:44.219509 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.219105 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" event={"ID":"c2b43aff-59dd-4cf0-a287-ececebd58c6f","Type":"ContainerStarted","Data":"8ee26c900d67aad9b85ae53af1bf344e9594d9f77b36c291d6e75221843ee9ce"} Apr 20 22:34:44.219509 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.219193 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" 
Apr 20 22:34:44.238374 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.238323 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" podStartSLOduration=2.238306566 podStartE2EDuration="2.238306566s" podCreationTimestamp="2026-04-20 22:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:34:44.236845743 +0000 UTC m=+611.910033465" watchObservedRunningTime="2026-04-20 22:34:44.238306566 +0000 UTC m=+611.911494252" Apr 20 22:34:44.718913 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.718877 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-57zvt_d07f11ad-2096-40a1-9534-a3146fa93510/dns/0.log" Apr 20 22:34:44.739707 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.739678 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-57zvt_d07f11ad-2096-40a1-9534-a3146fa93510/kube-rbac-proxy/0.log" Apr 20 22:34:44.871921 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:44.871890 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6cs9w_fa2a5c1d-a5d3-4341-8c92-2a050066670f/dns-node-resolver/0.log" Apr 20 22:34:45.448176 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:45.448124 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j77rr_5af6d22e-6f75-493a-a5ab-9f2a0eafa36f/node-ca/0.log" Apr 20 22:34:46.388245 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:46.388214 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7755c94fdf-gtfmk_2fd070d1-dbe6-4220-a085-8debaefaff0d/kube-auth-proxy/0.log" Apr 20 22:34:46.958715 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:46.958686 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-5756b_c1e78fef-7128-47b2-a77d-46a98bb24af9/serve-healthcheck-canary/0.log" Apr 20 22:34:47.553707 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:47.553669 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-khz6t_09fe9e71-2821-4229-a355-e118b4e9f593/kube-rbac-proxy/0.log" Apr 20 22:34:47.573477 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:47.573445 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-khz6t_09fe9e71-2821-4229-a355-e118b4e9f593/exporter/0.log" Apr 20 22:34:47.595056 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:47.595029 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-khz6t_09fe9e71-2821-4229-a355-e118b4e9f593/extractor/0.log" Apr 20 22:34:49.633272 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:49.633231 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d8d569d47-t9tvx_de45cef1-0c87-4ce3-958b-7bc29edea051/manager/0.log" Apr 20 22:34:49.706066 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:49.706040 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-ghhph_54575915-5205-4036-a330-79b768a5073f/postgres/0.log" Apr 20 22:34:50.233119 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:50.233083 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-f799q" Apr 20 22:34:50.944750 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:50.944714 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-845776cd66-lznfq_967bcc7c-130b-413e-af49-f4650bda8ca6/manager/0.log" Apr 20 22:34:50.969126 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:50.969096 2568 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-dzlm2_b2851f99-0f37-44c3-98b1-9a45a63ddce0/openshift-lws-operator/0.log" Apr 20 22:34:56.889360 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:56.889334 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/kube-multus-additional-cni-plugins/0.log" Apr 20 22:34:56.909414 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:56.909391 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/egress-router-binary-copy/0.log" Apr 20 22:34:56.929617 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:56.929588 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/cni-plugins/0.log" Apr 20 22:34:56.951678 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:56.951648 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/bond-cni-plugin/0.log" Apr 20 22:34:56.972862 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:56.972836 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/routeoverride-cni/0.log" Apr 20 22:34:56.992694 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:56.992662 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/whereabouts-cni-bincopy/0.log" Apr 20 22:34:57.013146 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:57.013116 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4npn9_e3a6b05e-7ccd-4812-b0a3-5860098b7618/whereabouts-cni/0.log" Apr 20 22:34:57.394754 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:57.394729 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-snx6p_99b45f86-8fd1-4884-ab21-716f105f2d77/kube-multus/0.log" Apr 20 22:34:57.413288 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:57.413251 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5kgfv_a6b93d87-66d5-4f06-b428-6cbc7fcdeda2/network-metrics-daemon/0.log" Apr 20 22:34:57.437966 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:57.437923 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5kgfv_a6b93d87-66d5-4f06-b428-6cbc7fcdeda2/kube-rbac-proxy/0.log" Apr 20 22:34:58.698321 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.698292 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/ovn-controller/0.log" Apr 20 22:34:58.725093 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.725056 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/ovn-acl-logging/0.log" Apr 20 22:34:58.746007 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.745976 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/kube-rbac-proxy-node/0.log" Apr 20 22:34:58.766540 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.766514 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 22:34:58.787860 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.787832 2568 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/northd/0.log" Apr 20 22:34:58.807730 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.807694 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/nbdb/0.log" Apr 20 22:34:58.832657 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:58.832615 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/sbdb/0.log" Apr 20 22:34:59.011635 ip-10-0-133-201 kubenswrapper[2568]: I0420 22:34:59.011545 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r2zfd_9f3be14a-c6d2-4e17-88f8-9129b465bd71/ovnkube-controller/0.log"