Apr 20 13:31:14.328025 ip-10-0-133-1 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 13:31:14.874434 ip-10-0-133-1 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:31:14.874434 ip-10-0-133-1 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 13:31:14.874434 ip-10-0-133-1 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:31:14.874434 ip-10-0-133-1 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 13:31:14.874434 ip-10-0-133-1 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
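The deprecation messages above all point at the file passed via the kubelet's --config flag. As a hedged sketch of what the config-file equivalents look like (field names from the upstream KubeletConfiguration v1beta1 API; the values below are illustrative assumptions, not read from this node):

```yaml
# Illustrative KubeletConfiguration fragment -- NOT this node's actual config.
# Shows the config-file fields that replace the deprecated flags logged above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (path is an example)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (amounts are examples)
systemReserved:
  cpu: 500m
  memory: 1Gi
# the --minimum-container-ttl-duration warning says to use eviction
# thresholds instead (threshold is an example)
evictionHard:
  memory.available: 100Mi
```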
Apr 20 13:31:14.878392 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.878298 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 13:31:14.882020 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.881997 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:31:14.882020 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882015 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:31:14.882020 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882020 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:31:14.882020 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882023 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:31:14.882020 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882027 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882030 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882033 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882037 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882040 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882043 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882047 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882052 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882054 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882057 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882060 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882062 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882065 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882068 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882070 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882073 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882075 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882078 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882080 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:31:14.882209 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882090 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882093 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882095 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882098 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882100 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882103 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882106 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882109 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882112 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882115 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882117 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882120 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882123 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882125 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882127 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882132 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882136 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882139 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882142 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882145 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:31:14.882686 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882147 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882151 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882155 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882157 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882160 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882162 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882165 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882168 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882171 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882174 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882176 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882179 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882182 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882185 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882187 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882190 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882193 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882195 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882197 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:31:14.883192 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882200 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882202 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882205 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882207 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882210 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882213 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882215 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882218 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882223 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882226 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882228 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882231 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882233 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882236 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882239 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882241 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882247 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882250 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882253 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882256 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882258 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:31:14.883658 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882261 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882263 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882266 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882661 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882667 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882670 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882672 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882675 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882678 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882680 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882683 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882685 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882688 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882691 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882694 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882696 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882699 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882702 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882705 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882708 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:31:14.884183 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882713 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882716 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882719 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882723 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882725 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882728 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882731 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882735 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882737 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882740 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882743 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882760 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882763 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882766 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882769 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882772 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882775 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882778 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882781 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882783 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:31:14.884659 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882786 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882788 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882791 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882794 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882797 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882799 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882802 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882806 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882810 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882812 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882815 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882818 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882821 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882823 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882826 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882829 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882831 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882834 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882836 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:31:14.885173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882839 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882842 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882844 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882847 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882850 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882853 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882855 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882858 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882861 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882863 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882866 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882868 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882871 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882873 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882876 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882879 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882881 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882884 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882886 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:31:14.885638 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882889 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882891 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882894 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882897 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882899 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882903 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882906 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882908 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882911 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882914 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.882916 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884670 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884679 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884686 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884691 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884696 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884700 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884709 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884714 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884718 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884721 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884724 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 13:31:14.886137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884728 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884731 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884734 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884737 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884740 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884742 2573 flags.go:64] FLAG: --cloud-config=""
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884756 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884760 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884763 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884766 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884769 2573 flags.go:64] FLAG: --config-dir=""
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884772 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884776 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884784 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884787 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884790 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884794 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884797 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884800 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884803 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884806 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884809 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884813 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884816 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884819 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 13:31:14.886686 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884822 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884827 2573 flags.go:64] FLAG: --enable-server="true"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884830 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884835 2573 flags.go:64] FLAG: --event-burst="100"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884838 2573 flags.go:64] FLAG: --event-qps="50"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884841 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884844 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884848 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884852 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884855 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884858 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884861 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884864 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884869 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884872 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884875 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884878 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884880 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884883 2573 flags.go:64] FLAG: --feature-gates=""
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884887 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884892 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884895 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884898 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884901 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884904 2573 flags.go:64] FLAG: --help="false"
Apr 20 13:31:14.887296 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884907 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-133-1.ec2.internal"
Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884910 2573 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 13:31:14.887944
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884914 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884917 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884920 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884923 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884926 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884929 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884932 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884936 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884939 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884942 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884944 2573 flags.go:64] FLAG: --kube-reserved="" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884947 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884951 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884954 2573 flags.go:64] FLAG: --kubelet-cgroups="" 
Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884957 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884960 2573 flags.go:64] FLAG: --lock-file="" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884963 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884965 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884970 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884976 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884979 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884982 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 13:31:14.887944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884984 2573 flags.go:64] FLAG: --logging-format="text" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884987 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884990 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884994 2573 flags.go:64] FLAG: --manifest-url="" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.884997 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885002 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885005 2573 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885009 2573 flags.go:64] FLAG: --max-pods="110" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885012 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885015 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885018 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885021 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885024 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885027 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885030 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885037 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885040 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885043 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885047 2573 flags.go:64] FLAG: --pod-cidr="" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885050 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 
13:31:14.885055 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885058 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885061 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885064 2573 flags.go:64] FLAG: --port="10250" Apr 20 13:31:14.888524 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885067 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885070 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fcab011b97beb268" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885073 2573 flags.go:64] FLAG: --qos-reserved="" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885076 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885081 2573 flags.go:64] FLAG: --register-node="true" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885084 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885087 2573 flags.go:64] FLAG: --register-with-taints="" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885090 2573 flags.go:64] FLAG: --registry-burst="10" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885093 2573 flags.go:64] FLAG: --registry-qps="5" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885096 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885099 2573 flags.go:64] FLAG: --reserved-memory="" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885103 2573 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885108 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885111 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885114 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885116 2573 flags.go:64] FLAG: --runonce="false" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885119 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885123 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885126 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885129 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885132 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885135 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885138 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885141 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885144 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 13:31:14.889139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885147 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 13:31:14.889139 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:31:14.885149 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885153 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885156 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885159 2573 flags.go:64] FLAG: --system-cgroups="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885161 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885167 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885169 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885172 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885176 2573 flags.go:64] FLAG: --tls-min-version="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885179 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885183 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885186 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885189 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885192 2573 flags.go:64] FLAG: --v="2" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885197 2573 flags.go:64] FLAG: --version="false" Apr 20 13:31:14.889778 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:31:14.885200 2573 flags.go:64] FLAG: --vmodule="" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885205 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885208 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885301 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885304 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885307 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885311 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885315 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 13:31:14.889778 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885319 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885322 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885325 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885327 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885330 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885333 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885336 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885339 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885342 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885345 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885348 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885351 2573 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885354 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885356 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885359 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885362 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885364 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885367 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885369 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885372 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 13:31:14.890406 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885374 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885377 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885379 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885382 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885384 2573 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885387 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885389 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885392 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885395 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885397 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885400 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885402 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885405 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885409 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885412 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885415 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885418 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885420 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885423 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885426 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 13:31:14.890942 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885428 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885431 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885434 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885436 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885439 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885442 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885444 2573 feature_gate.go:328] 
unrecognized feature gate: InsightsOnDemandDataGather Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885447 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885449 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885452 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885454 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885457 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885459 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885462 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885464 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885467 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885469 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885472 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885474 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885477 2573 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 13:31:14.891432 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885479 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885482 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885485 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885487 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885490 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885492 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885495 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885497 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885500 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885502 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885505 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885507 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 
13:31:14.885510 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885512 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885515 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885518 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885520 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885523 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885525 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:31:14.891931 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885528 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:31:14.892392 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.885530 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:31:14.892392 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.885538 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:31:14.892488 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.892453 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 13:31:14.892488 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.892475 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 13:31:14.892540 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892525 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:31:14.892540 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892531 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:31:14.892540 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892535 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:31:14.892540 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892538 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:31:14.892540 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892541 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892545 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892547 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892550 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892553 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892556 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892559 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892562 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892565 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892569 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892573 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892577 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892580 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892583 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892586 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892589 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892593 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892596 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892598 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892601 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:31:14.892667 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892604 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892606 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892609 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892612 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892614 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892617 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892620 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892623 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892626 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892629 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892632 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892634 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892637 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892640 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892643 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892646 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892648 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892651 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892653 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892656 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:31:14.893173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892658 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892661 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892663 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892666 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892668 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892671 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892674 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892677 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892679 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892682 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892685 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892687 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892690 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892692 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892695 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892697 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892700 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892702 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892705 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:31:14.893654 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892708 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892713 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892716 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892719 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892722 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892725 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892727 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892730 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892733 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892736 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892738 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892741 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892743 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892759 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892764 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892768 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892772 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892775 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892778 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:31:14.894160 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892780 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892783 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892786 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892788 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.892794 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892895 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892900 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892903 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892906 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892909 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892912 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892916 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892921 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892924 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892928 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:31:14.894634 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892932 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892935 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892938 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892941 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892943 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892946 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892949 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892951 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892954 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892957 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892959 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892962 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892965 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892967 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892970 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892973 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892975 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892978 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892980 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892983 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:31:14.895011 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892985 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892988 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892990 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892993 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892995 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.892998 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893000 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893003 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893005 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893008 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893010 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893013 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893017 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893020 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893022 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893025 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893027 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893030 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893032 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893035 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:31:14.895501 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893037 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893040 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893042 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893045 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893047 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893049 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893052 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893054 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893057 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893060 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893062 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893064 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893067 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893069 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893072 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893074 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893077 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893079 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893082 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893084 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:31:14.895988 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893087 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893089 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893092 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893095 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893098 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893100 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893103 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893106 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893108 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893111 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893114 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893116 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893118 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893121 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893123 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:14.893126 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:31:14.896467 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.893131 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:31:14.896904 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.893915 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 13:31:14.896904 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.895983 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 13:31:14.896998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.896986 2573 server.go:1019] "Starting client certificate rotation"
Apr 20 13:31:14.897093 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.897076 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 13:31:14.897123 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.897115 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 13:31:14.923046 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.923027 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 13:31:14.929620 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.929596 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 13:31:14.948424 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.948404 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 20 13:31:14.954261 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.954244 2573 log.go:25] "Validated CRI v1 image API"
Apr 20 13:31:14.956317 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.956302 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 13:31:14.957466 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.957450 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 13:31:14.963495 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.963476 2573 fs.go:135] Filesystem UUIDs: map[2e5fc78c-5a0c-44e9-af82-ee29bfaf3b27:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 efec5b52-71d3-41c3-a4e1-4173da5bea4f:/dev/nvme0n1p3]
Apr 20 13:31:14.963552 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.963496 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 13:31:14.969393 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.969287 2573 manager.go:217] Machine: {Timestamp:2026-04-20 13:31:14.967346439 +0000 UTC m=+0.497272810 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097206 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27d0d23757a42fa5d15205ea38fe56 SystemUUID:ec27d0d2-3757-a42f-a5d1-5205ea38fe56 BootID:912d1838-3765-41cc-9aca-830d02d751f8 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c7:7e:ed:98:61 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c7:7e:ed:98:61 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:4a:fa:59:eb:9a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 13:31:14.969393 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.969393 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 13:31:14.969496 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.969479 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 13:31:14.970997 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.970971 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 13:31:14.971145 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.970999 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-1.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 13:31:14.971187 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.971155 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 13:31:14.971187 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.971163 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 13:31:14.971187 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.971177 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 13:31:14.972074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.972064 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 13:31:14.973419 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.973385 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 13:31:14.973502 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.973493 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 13:31:14.976242 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.976231 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 13:31:14.976283 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.976247 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 13:31:14.976283 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.976260 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 13:31:14.976283 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.976270 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 20 13:31:14.976283 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.976279 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 13:31:14.977412 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.977400 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 13:31:14.977452 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.977418 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 13:31:14.977860 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.977842 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7jss7"
Apr 20 13:31:14.980871 ip-10-0-133-1 kubenswrapper[2573]: I0420
13:31:14.980852 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 13:31:14.982685 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.982671 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 13:31:14.984049 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984034 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984055 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984069 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984075 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984081 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984089 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984095 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 13:31:14.984102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984100 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 13:31:14.984418 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984110 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 13:31:14.984418 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984124 2573 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/configmap" Apr 20 13:31:14.984418 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984143 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 13:31:14.984418 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.984152 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 13:31:14.985289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.985277 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 13:31:14.985289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.985289 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 13:31:14.988710 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.987886 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7jss7" Apr 20 13:31:14.988710 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:14.988317 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-1.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 13:31:14.988710 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:14.988366 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 13:31:14.989670 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.989651 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 13:31:14.989743 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.989711 2573 server.go:1295] "Started kubelet" Apr 20 13:31:14.989852 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.989825 2573 
server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 13:31:14.989986 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.989941 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 13:31:14.990026 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.990014 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 13:31:14.990449 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.990434 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-1.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 13:31:14.990664 ip-10-0-133-1 systemd[1]: Started Kubernetes Kubelet. Apr 20 13:31:14.991233 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.991219 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 13:31:14.992351 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.992336 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 20 13:31:14.997243 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.997224 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 13:31:14.997820 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.997801 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 13:31:14.998529 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998498 2573 factory.go:55] Registering systemd factory Apr 20 13:31:14.998529 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998515 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 13:31:14.998529 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998519 2573 factory.go:223] Registration of the systemd container factory successfully Apr 20 13:31:14.998694 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:31:14.998636 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 13:31:14.998694 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998650 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 13:31:14.998791 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998758 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 20 13:31:14.998791 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998769 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 20 13:31:14.998991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998816 2573 factory.go:153] Registering CRI-O factory Apr 20 13:31:14.998991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.998993 2573 factory.go:223] Registration of the crio container factory successfully Apr 20 13:31:14.999109 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:14.998895 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:14.999109 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.999080 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 13:31:14.999109 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.999107 2573 factory.go:103] Registering Raw factory Apr 20 13:31:14.999262 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.999120 2573 manager.go:1196] Started watching for new ooms in manager Apr 20 13:31:14.999623 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:14.999607 2573 manager.go:319] Starting recovery of all containers Apr 20 13:31:15.000540 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.000512 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:31:15.004695 ip-10-0-133-1 
kubenswrapper[2573]: E0420 13:31:15.004674 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-1.ec2.internal\" not found" node="ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.009768 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.009732 2573 manager.go:324] Recovery completed Apr 20 13:31:15.015397 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.015375 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:31:15.018107 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.018087 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:31:15.018193 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.018118 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:31:15.018193 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.018129 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:31:15.018644 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.018629 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 13:31:15.018644 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.018642 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 13:31:15.018779 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.018663 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 20 13:31:15.021168 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.021150 2573 policy_none.go:49] "None policy: Start" Apr 20 13:31:15.021255 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.021172 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 13:31:15.021255 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.021185 2573 state_mem.go:35] "Initializing new in-memory state store" 
Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067363 2573 manager.go:341] "Starting Device Plugin manager" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.067402 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067413 2573 server.go:85] "Starting device plugin registration server" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067656 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067666 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067744 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067887 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.067895 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.068378 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 13:31:15.069641 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.068411 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.128083 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.128005 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 20 13:31:15.129221 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.129200 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 13:31:15.129268 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.129231 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 13:31:15.129268 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.129255 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 13:31:15.129268 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.129266 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 13:31:15.129371 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.129306 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 13:31:15.132685 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.132667 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:31:15.168464 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.168439 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:31:15.169395 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.169376 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:31:15.169489 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.169412 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:31:15.169489 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.169427 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:31:15.169489 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.169458 2573 kubelet_node_status.go:78] 
"Attempting to register node" node="ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.178298 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.178273 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.178298 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.178300 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-1.ec2.internal\": node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.196421 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.196395 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.230346 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.230305 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal"] Apr 20 13:31:15.230458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.230404 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:31:15.232745 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.232730 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:31:15.232829 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.232774 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:31:15.232829 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.232785 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:31:15.234246 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.234233 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 
13:31:15.234406 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.234392 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.234441 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.234421 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:31:15.240564 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.240543 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:31:15.240564 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.240552 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:31:15.240693 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.240574 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:31:15.240693 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.240578 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:31:15.240693 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.240587 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:31:15.240827 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.240589 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:31:15.241931 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.241916 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.241980 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.241956 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 13:31:15.243364 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.243348 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientMemory" Apr 20 13:31:15.243427 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.243379 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 13:31:15.243427 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.243394 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeHasSufficientPID" Apr 20 13:31:15.257273 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.257252 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-1.ec2.internal\" not found" node="ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.260927 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.260913 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-1.ec2.internal\" not found" node="ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.296507 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.296469 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.301147 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.301130 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b85f0afc60480346cf8546c53d0560e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal\" (UID: 
\"b85f0afc60480346cf8546c53d0560e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.301203 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.301157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b85f0afc60480346cf8546c53d0560e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal\" (UID: \"b85f0afc60480346cf8546c53d0560e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.301203 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.301175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/31008c883fac4d80f804d5217a8035e0-config\") pod \"kube-apiserver-proxy-ip-10-0-133-1.ec2.internal\" (UID: \"31008c883fac4d80f804d5217a8035e0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.396914 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.396842 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.402207 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.402187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b85f0afc60480346cf8546c53d0560e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal\" (UID: \"b85f0afc60480346cf8546c53d0560e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.402256 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.402215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b85f0afc60480346cf8546c53d0560e5-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal\" (UID: \"b85f0afc60480346cf8546c53d0560e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.402256 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.402235 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/31008c883fac4d80f804d5217a8035e0-config\") pod \"kube-apiserver-proxy-ip-10-0-133-1.ec2.internal\" (UID: \"31008c883fac4d80f804d5217a8035e0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.402315 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.402295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/31008c883fac4d80f804d5217a8035e0-config\") pod \"kube-apiserver-proxy-ip-10-0-133-1.ec2.internal\" (UID: \"31008c883fac4d80f804d5217a8035e0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.402347 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.402312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b85f0afc60480346cf8546c53d0560e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal\" (UID: \"b85f0afc60480346cf8546c53d0560e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.402347 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.402318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b85f0afc60480346cf8546c53d0560e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal\" (UID: \"b85f0afc60480346cf8546c53d0560e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.497629 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.497584 
2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.558999 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.558946 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.562882 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.562863 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" Apr 20 13:31:15.598582 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.598555 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.699041 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.698951 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.799422 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.799387 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.896719 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.896688 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 13:31:15.897359 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.896866 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 13:31:15.897359 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.896896 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" 
err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 13:31:15.899903 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:15.899884 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:15.990670 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.990630 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 13:26:14 +0000 UTC" deadline="2027-10-11 18:53:27.394367122 +0000 UTC" Apr 20 13:31:15.990670 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.990665 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12941h22m11.403705612s" Apr 20 13:31:15.997782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:15.997759 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 13:31:16.000795 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:16.000776 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:16.008044 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.008023 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 13:31:16.031156 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.031132 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jmtcx" Apr 20 13:31:16.037634 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.037615 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jmtcx" Apr 20 13:31:16.068526 ip-10-0-133-1 
kubenswrapper[2573]: W0420 13:31:16.068495 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31008c883fac4d80f804d5217a8035e0.slice/crio-21a10343464f8309cb9aedd6700fd7209bf8f0403ca4af12899ae35dd93647bd WatchSource:0}: Error finding container 21a10343464f8309cb9aedd6700fd7209bf8f0403ca4af12899ae35dd93647bd: Status 404 returned error can't find the container with id 21a10343464f8309cb9aedd6700fd7209bf8f0403ca4af12899ae35dd93647bd Apr 20 13:31:16.068739 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:16.068720 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85f0afc60480346cf8546c53d0560e5.slice/crio-caf3cbba70faaed9aa14dd2484a88bc199433521b384790c1a09b854f5ab90b5 WatchSource:0}: Error finding container caf3cbba70faaed9aa14dd2484a88bc199433521b384790c1a09b854f5ab90b5: Status 404 returned error can't find the container with id caf3cbba70faaed9aa14dd2484a88bc199433521b384790c1a09b854f5ab90b5 Apr 20 13:31:16.074729 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.074711 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 13:31:16.101821 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:16.101801 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:16.132822 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.132769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" event={"ID":"31008c883fac4d80f804d5217a8035e0","Type":"ContainerStarted","Data":"21a10343464f8309cb9aedd6700fd7209bf8f0403ca4af12899ae35dd93647bd"} Apr 20 13:31:16.133740 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.133720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" event={"ID":"b85f0afc60480346cf8546c53d0560e5","Type":"ContainerStarted","Data":"caf3cbba70faaed9aa14dd2484a88bc199433521b384790c1a09b854f5ab90b5"} Apr 20 13:31:16.202054 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:16.202027 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-1.ec2.internal\" not found" Apr 20 13:31:16.209854 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.209831 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:31:16.298219 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.298197 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" Apr 20 13:31:16.313013 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.312994 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 13:31:16.315481 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.315468 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" Apr 20 13:31:16.323222 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.323204 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 13:31:16.434399 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.434375 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:31:16.702399 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.702154 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 
13:31:16.977796 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.977712 2573 apiserver.go:52] "Watching apiserver" Apr 20 13:31:16.987505 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.987483 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 13:31:16.988521 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.988496 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-mfqvp","openshift-ovn-kubernetes/ovnkube-node-qtnxb","kube-system/konnectivity-agent-xnv4k","kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg","openshift-cluster-node-tuning-operator/tuned-48ws4","openshift-dns/node-resolver-75q77","openshift-multus/multus-s9jnj","openshift-multus/network-metrics-daemon-55n9j","openshift-network-operator/iptables-alerter-v9qp2","openshift-image-registry/node-ca-qvdjn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal","openshift-multus/multus-additional-cni-plugins-bwnwj"] Apr 20 13:31:16.989984 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.989964 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:16.991354 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.991331 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:16.992580 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.992549 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 13:31:16.992677 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.992593 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:16.992677 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.992549 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.992677 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.992632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z5xx8\"" Apr 20 13:31:16.992986 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.992959 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 13:31:16.993565 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.993541 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 13:31:16.993971 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.993688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-phdfg\"" Apr 20 13:31:16.993971 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.993709 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 13:31:16.993971 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.993799 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 13:31:16.993971 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.993835 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 13:31:16.994200 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.993976 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.994467 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.994450 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:16.994742 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.994679 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 13:31:16.995069 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.994948 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 13:31:16.995069 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.994985 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gdjnq\"" Apr 20 13:31:16.995069 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.995030 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 13:31:16.995964 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.995924 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:16.996454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.996439 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 13:31:16.996548 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.996522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kmvbg\"" Apr 20 13:31:16.996781 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.996761 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.996882 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.996790 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 13:31:16.997170 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.997153 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:16.997931 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.997899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.998007 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.997941 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 13:31:16.998007 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.997967 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9q7gl\"" Apr 20 13:31:16.998600 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.998339 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s9jnj" Apr 20 13:31:16.999041 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.999005 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 13:31:16.999205 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.999166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d66cx\"" Apr 20 13:31:16.999205 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:16.999169 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 13:31:17.000379 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.000349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mdmhs\"" Apr 20 13:31:17.000460 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.000360 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 
13:31:17.000556 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.000461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.000556 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.000542 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:17.000770 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.000737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 13:31:17.000826 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.000777 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 13:31:17.000826 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.000800 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 13:31:17.001654 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.001634 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:17.001738 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.001701 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:17.003373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.003351 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.004187 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.004170 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.006088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.005884 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 13:31:17.006088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.005894 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 13:31:17.006088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.005966 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 13:31:17.006088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.005991 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-54ggv\"" Apr 20 13:31:17.006328 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.006241 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 13:31:17.006491 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.006335 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 13:31:17.006491 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.006396 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gchtz\"" Apr 20 13:31:17.010500 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c062d4f4-2415-4685-915a-14cbd0991ab3-konnectivity-ca\") pod \"konnectivity-agent-xnv4k\" (UID: \"c062d4f4-2415-4685-915a-14cbd0991ab3\") " pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.010592 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010514 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-cni-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.010592 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010542 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.010592 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pz9m\" (UniqueName: \"kubernetes.io/projected/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-kube-api-access-4pz9m\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.010732 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-var-lib-kubelet\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.010732 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-log-socket\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.010860 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovn-node-metrics-cert\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.010916 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.010970 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a929d3eb-e608-498a-85b2-7ac9ff81b424-iptables-alerter-script\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 
13:31:17.011292 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.010971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-systemd-units\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.011377 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.011434 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c062d4f4-2415-4685-915a-14cbd0991ab3-agent-certs\") pod \"konnectivity-agent-xnv4k\" (UID: \"c062d4f4-2415-4685-915a-14cbd0991ab3\") " pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.011489 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-sys-fs\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.011489 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlstt\" (UniqueName: \"kubernetes.io/projected/e0388411-4485-4a66-9511-1c06b60790d7-kube-api-access-jlstt\") pod 
\"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.011584 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwz58\" (UniqueName: \"kubernetes.io/projected/fe989244-9412-433d-9a95-1acaacc7f0cb-kube-api-access-wwz58\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.011696 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbms6\" (UniqueName: \"kubernetes.io/projected/a929d3eb-e608-498a-85b2-7ac9ff81b424-kube-api-access-kbms6\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.011806 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.011727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-cni-bin\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8z28\" (UniqueName: \"kubernetes.io/projected/884c5a2b-9d81-40ae-a58b-9b1298785f9b-kube-api-access-n8z28\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012263 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-etc-kubernetes\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slcnv\" (UniqueName: \"kubernetes.io/projected/d1caa7df-6d09-474b-b1e5-e18a510edd97-kube-api-access-slcnv\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-var-lib-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-etc-selinux\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-cnibin\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:31:17.012403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1caa7df-6d09-474b-b1e5-e18a510edd97-cni-binary-copy\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-conf-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-multus-certs\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-netns\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-run\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 
13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-tuned\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-registration-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-kubelet\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012767 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-slash\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012794 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.013373 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-cni-netd\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-system-cni-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-cni-bin\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-cni-multus\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-daemon-config\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.012975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-socket-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-run-netns\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-lib-modules\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-kubelet\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013133 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-socket-dir-parent\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-sys\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-etc-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nq86\" (UniqueName: \"kubernetes.io/projected/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-kube-api-access-6nq86\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-systemd\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovnkube-script-lib\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013401 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e0388411-4485-4a66-9511-1c06b60790d7-hosts-file\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-modprobe-d\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysconfig\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysctl-d\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe989244-9412-433d-9a95-1acaacc7f0cb-tmp\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.013557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovnkube-config\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-device-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014689 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0388411-4485-4a66-9511-1c06b60790d7-tmp-dir\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-hostroot\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014771 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysctl-conf\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-systemd\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-env-overrides\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.014977 ip-10-0-133-1 kubenswrapper[2573]: I0420 
13:31:17.014953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-os-release\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.015477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.014999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-k8s-cni-cncf-io\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.015477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.015036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-kubernetes\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.015477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.015128 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-host\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.015477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.015164 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a929d3eb-e608-498a-85b2-7ac9ff81b424-host-slash\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 
13:31:17.015477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.015209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-ovn\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.015477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.015280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-node-log\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.038365 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.038340 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 13:26:16 +0000 UTC" deadline="2027-12-17 23:18:44.22823044 +0000 UTC" Apr 20 13:31:17.038365 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.038364 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14553h47m27.189868965s" Apr 20 13:31:17.100137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.100104 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 13:31:17.116315 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-cni-bin\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.116315 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116325 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-cni-multus\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.116514 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-daemon-config\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.116514 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-cni-bin\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.116514 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-cni-multus\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.116514 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-cnibin\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.116514 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116501 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-socket-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-run-netns\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-lib-modules\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-kubelet\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.116782 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-socket-dir-parent\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-socket-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-sys\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-etc-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 
13:31:17.116782 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nq86\" (UniqueName: \"kubernetes.io/projected/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-kube-api-access-6nq86\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9xw5\" (UniqueName: \"kubernetes.io/projected/650b481f-9321-4709-a40c-b7e7ad6e6429-kube-api-access-t9xw5\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-systemd\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovnkube-script-lib\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e0388411-4485-4a66-9511-1c06b60790d7-hosts-file\") pod \"node-resolver-75q77\" (UID: 
\"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-modprobe-d\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116956 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysconfig\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.116986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysctl-d\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe989244-9412-433d-9a95-1acaacc7f0cb-tmp\") pod 
\"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovnkube-config\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-device-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-daemon-config\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0388411-4485-4a66-9511-1c06b60790d7-tmp-dir\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-hostroot\") pod \"multus-s9jnj\" (UID: 
\"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysctl-conf\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-systemd\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117241 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-kubelet\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-env-overrides\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-run-netns\") pod \"ovnkube-node-qtnxb\" (UID: 
\"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-os-release\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-k8s-cni-cncf-io\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-os-release\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-systemd\") pod \"tuned-48ws4\" (UID: 
\"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-socket-dir-parent\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysconfig\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysctl-d\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.117644 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.117720 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.617684692 +0000 UTC m=+3.147611053 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117849 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovnkube-script-lib\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117933 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-sys\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117933 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-system-cni-dir\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117991 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-kubernetes\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.117998 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.117997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-systemd\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-host\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a929d3eb-e608-498a-85b2-7ac9ff81b424-host-slash\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-ovn\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-node-log\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c062d4f4-2415-4685-915a-14cbd0991ab3-konnectivity-ca\") pod \"konnectivity-agent-xnv4k\" (UID: \"c062d4f4-2415-4685-915a-14cbd0991ab3\") " pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-cni-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-kubernetes\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-os-release\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-ovn\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-etc-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pz9m\" (UniqueName: \"kubernetes.io/projected/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-kube-api-access-4pz9m\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.118845 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-cni-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovnkube-config\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-host\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.118845 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-node-log\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a929d3eb-e608-498a-85b2-7ac9ff81b424-host-slash\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118507 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-var-lib-kubelet\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-log-socket\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118550 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovn-node-metrics-cert\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a929d3eb-e608-498a-85b2-7ac9ff81b424-iptables-alerter-script\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.119651 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-systemd-units\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/884c5a2b-9d81-40ae-a58b-9b1298785f9b-env-overrides\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c062d4f4-2415-4685-915a-14cbd0991ab3-agent-certs\") pod \"konnectivity-agent-xnv4k\" (UID: \"c062d4f4-2415-4685-915a-14cbd0991ab3\") " pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-sys-fs\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.119651 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e0388411-4485-4a66-9511-1c06b60790d7-hosts-file\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118797 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlstt\" (UniqueName: \"kubernetes.io/projected/e0388411-4485-4a66-9511-1c06b60790d7-kube-api-access-jlstt\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-sys-fs\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118829 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d282e4a7-f5fa-463a-a056-646bf858c554-serviceca\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwz58\" (UniqueName: \"kubernetes.io/projected/fe989244-9412-433d-9a95-1acaacc7f0cb-kube-api-access-wwz58\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 
13:31:17.119651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbms6\" (UniqueName: \"kubernetes.io/projected/a929d3eb-e608-498a-85b2-7ac9ff81b424-kube-api-access-kbms6\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-cni-bin\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-sysctl-conf\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8z28\" (UniqueName: \"kubernetes.io/projected/884c5a2b-9d81-40ae-a58b-9b1298785f9b-kube-api-access-n8z28\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qtnxb\" (UID: 
\"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118958 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c062d4f4-2415-4685-915a-14cbd0991ab3-konnectivity-ca\") pod \"konnectivity-agent-xnv4k\" (UID: \"c062d4f4-2415-4685-915a-14cbd0991ab3\") " pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-etc-kubernetes\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.118992 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-lib-modules\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-etc-kubernetes\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-cni-bin\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-run-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slcnv\" (UniqueName: \"kubernetes.io/projected/d1caa7df-6d09-474b-b1e5-e18a510edd97-kube-api-access-slcnv\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119149 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-log-socket\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-systemd-units\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-device-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-cni-binary-copy\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-var-lib-kubelet\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.120443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-var-lib-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a929d3eb-e608-498a-85b2-7ac9ff81b424-iptables-alerter-script\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-etc-selinux\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-k8s-cni-cncf-io\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-hostroot\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-var-lib-openvswitch\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-modprobe-d\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0388411-4485-4a66-9511-1c06b60790d7-tmp-dir\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-cnibin\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119601 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-cnibin\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1caa7df-6d09-474b-b1e5-e18a510edd97-cni-binary-copy\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-etc-selinux\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: 
\"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-conf-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-multus-certs\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-netns\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-multus-conf-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-multus-certs\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " 
pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d282e4a7-f5fa-463a-a056-646bf858c554-host\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-run-netns\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wgt4\" (UniqueName: \"kubernetes.io/projected/d282e4a7-f5fa-463a-a056-646bf858c554-kube-api-access-9wgt4\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-run\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-tuned\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 
13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.119960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fe989244-9412-433d-9a95-1acaacc7f0cb-run\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-registration-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-kubelet\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-registration-dir\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1caa7df-6d09-474b-b1e5-e18a510edd97-cni-binary-copy\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 
13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-host-var-lib-kubelet\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-slash\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-slash\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120311 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-run-ovn-kubernetes\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 
13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120344 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-cni-netd\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-system-cni-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1caa7df-6d09-474b-b1e5-e18a510edd97-system-cni-dir\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.121838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.120537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/884c5a2b-9d81-40ae-a58b-9b1298785f9b-host-cni-netd\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.122500 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.121490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fe989244-9412-433d-9a95-1acaacc7f0cb-tmp\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.122500 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.121588 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/884c5a2b-9d81-40ae-a58b-9b1298785f9b-ovn-node-metrics-cert\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.122500 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.122240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c062d4f4-2415-4685-915a-14cbd0991ab3-agent-certs\") pod \"konnectivity-agent-xnv4k\" (UID: \"c062d4f4-2415-4685-915a-14cbd0991ab3\") " pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.122500 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.122405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fe989244-9412-433d-9a95-1acaacc7f0cb-etc-tuned\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.126219 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.126187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nq86\" (UniqueName: \"kubernetes.io/projected/3ed7a5eb-6816-49f2-bd12-bed6a3be081f-kube-api-access-6nq86\") pod \"aws-ebs-csi-driver-node-nvzpg\" (UID: \"3ed7a5eb-6816-49f2-bd12-bed6a3be081f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.128050 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.128007 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slcnv\" (UniqueName: \"kubernetes.io/projected/d1caa7df-6d09-474b-b1e5-e18a510edd97-kube-api-access-slcnv\") pod \"multus-s9jnj\" (UID: \"d1caa7df-6d09-474b-b1e5-e18a510edd97\") " pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.128548 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.128524 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlstt\" (UniqueName: \"kubernetes.io/projected/e0388411-4485-4a66-9511-1c06b60790d7-kube-api-access-jlstt\") pod \"node-resolver-75q77\" (UID: \"e0388411-4485-4a66-9511-1c06b60790d7\") " pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.128744 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.128716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8z28\" (UniqueName: \"kubernetes.io/projected/884c5a2b-9d81-40ae-a58b-9b1298785f9b-kube-api-access-n8z28\") pod \"ovnkube-node-qtnxb\" (UID: \"884c5a2b-9d81-40ae-a58b-9b1298785f9b\") " pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.129303 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.129281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwz58\" (UniqueName: \"kubernetes.io/projected/fe989244-9412-433d-9a95-1acaacc7f0cb-kube-api-access-wwz58\") pod \"tuned-48ws4\" (UID: \"fe989244-9412-433d-9a95-1acaacc7f0cb\") " pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.129492 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.129470 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbms6\" (UniqueName: \"kubernetes.io/projected/a929d3eb-e608-498a-85b2-7ac9ff81b424-kube-api-access-kbms6\") pod \"iptables-alerter-v9qp2\" (UID: \"a929d3eb-e608-498a-85b2-7ac9ff81b424\") " pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.129570 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.129531 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pz9m\" (UniqueName: \"kubernetes.io/projected/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-kube-api-access-4pz9m\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.133683 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:31:17.133666 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:31:17.221619 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9xw5\" (UniqueName: \"kubernetes.io/projected/650b481f-9321-4709-a40c-b7e7ad6e6429-kube-api-access-t9xw5\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.221619 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.221841 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:17.221841 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-system-cni-dir\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.221841 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221706 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-os-release\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.221841 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-system-cni-dir\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d282e4a7-f5fa-463a-a056-646bf858c554-serviceca\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.222051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221951 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-cni-binary-copy\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " 
pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.221978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d282e4a7-f5fa-463a-a056-646bf858c554-host\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.222051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wgt4\" (UniqueName: \"kubernetes.io/projected/d282e4a7-f5fa-463a-a056-646bf858c554-kube-api-access-9wgt4\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.222051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222005 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-os-release\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222366 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-cnibin\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222366 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-cnibin\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") 
" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222366 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222366 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222062 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d282e4a7-f5fa-463a-a056-646bf858c554-host\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.222366 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/650b481f-9321-4709-a40c-b7e7ad6e6429-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222366 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222320 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222641 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/650b481f-9321-4709-a40c-b7e7ad6e6429-cni-binary-copy\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.222895 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.222869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d282e4a7-f5fa-463a-a056-646bf858c554-serviceca\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.227426 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.227405 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:17.227426 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.227424 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:17.227426 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.227434 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:17.227640 ip-10-0-133-1 
kubenswrapper[2573]: E0420 13:31:17.227490 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.727474194 +0000 UTC m=+3.257400552 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:17.230989 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.230966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9xw5\" (UniqueName: \"kubernetes.io/projected/650b481f-9321-4709-a40c-b7e7ad6e6429-kube-api-access-t9xw5\") pod \"multus-additional-cni-plugins-bwnwj\" (UID: \"650b481f-9321-4709-a40c-b7e7ad6e6429\") " pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.231088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.231034 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wgt4\" (UniqueName: \"kubernetes.io/projected/d282e4a7-f5fa-463a-a056-646bf858c554-kube-api-access-9wgt4\") pod \"node-ca-qvdjn\" (UID: \"d282e4a7-f5fa-463a-a056-646bf858c554\") " pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.303535 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.303500 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v9qp2" Apr 20 13:31:17.310344 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.310325 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:17.319098 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.319075 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:17.325560 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.325541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" Apr 20 13:31:17.332176 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.332146 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-48ws4" Apr 20 13:31:17.339743 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.339724 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-75q77" Apr 20 13:31:17.346408 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.346389 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s9jnj" Apr 20 13:31:17.352900 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.352877 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qvdjn" Apr 20 13:31:17.358387 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.358364 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" Apr 20 13:31:17.624115 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.623967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:17.624115 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.624109 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:17.624335 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.624173 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:18.624154896 +0000 UTC m=+4.154081259 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:17.695458 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:17.695307 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd282e4a7_f5fa_463a_a056_646bf858c554.slice/crio-75fca560dae59400944684ae205a6ee9839c89f2aad171ec7180721eb060d317 WatchSource:0}: Error finding container 75fca560dae59400944684ae205a6ee9839c89f2aad171ec7180721eb060d317: Status 404 returned error can't find the container with id 75fca560dae59400944684ae205a6ee9839c89f2aad171ec7180721eb060d317 Apr 20 13:31:17.696333 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:17.696307 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650b481f_9321_4709_a40c_b7e7ad6e6429.slice/crio-063be3e72bc513e80f0cf42605a4c79a65313b1e56beade21960c342041876a2 WatchSource:0}: Error finding container 063be3e72bc513e80f0cf42605a4c79a65313b1e56beade21960c342041876a2: Status 404 returned error can't find the container with id 063be3e72bc513e80f0cf42605a4c79a65313b1e56beade21960c342041876a2 Apr 20 13:31:17.697562 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:17.697193 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed7a5eb_6816_49f2_bd12_bed6a3be081f.slice/crio-aac21864861740f9483af81a063d2924f169c6fcb4390187073c38b74195a94b WatchSource:0}: Error finding container aac21864861740f9483af81a063d2924f169c6fcb4390187073c38b74195a94b: Status 404 returned error can't find the container with id aac21864861740f9483af81a063d2924f169c6fcb4390187073c38b74195a94b Apr 20 13:31:17.697941 ip-10-0-133-1 
kubenswrapper[2573]: W0420 13:31:17.697917 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0388411_4485_4a66_9511_1c06b60790d7.slice/crio-8c80b9944531d87f2926cf14b8553979780624469335a71a4a2f54d87de3e1c2 WatchSource:0}: Error finding container 8c80b9944531d87f2926cf14b8553979780624469335a71a4a2f54d87de3e1c2: Status 404 returned error can't find the container with id 8c80b9944531d87f2926cf14b8553979780624469335a71a4a2f54d87de3e1c2 Apr 20 13:31:17.705016 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:17.704913 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe989244_9412_433d_9a95_1acaacc7f0cb.slice/crio-c4d6367d97062d304e6c2a6b7fe06f35b09252cd95255e78741ba57a6da73de4 WatchSource:0}: Error finding container c4d6367d97062d304e6c2a6b7fe06f35b09252cd95255e78741ba57a6da73de4: Status 404 returned error can't find the container with id c4d6367d97062d304e6c2a6b7fe06f35b09252cd95255e78741ba57a6da73de4 Apr 20 13:31:17.705831 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:17.705797 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda929d3eb_e608_498a_85b2_7ac9ff81b424.slice/crio-249445fc25598a45b2cc88f95fbd6251935e8f7daa0ca497cc47b1512abc81ed WatchSource:0}: Error finding container 249445fc25598a45b2cc88f95fbd6251935e8f7daa0ca497cc47b1512abc81ed: Status 404 returned error can't find the container with id 249445fc25598a45b2cc88f95fbd6251935e8f7daa0ca497cc47b1512abc81ed Apr 20 13:31:17.706798 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:17.706655 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod884c5a2b_9d81_40ae_a58b_9b1298785f9b.slice/crio-a7e966b596f908ea6b6c3c68c665d9c87f5e4940523faeb89ca26e09670adb0c WatchSource:0}: Error finding container 
a7e966b596f908ea6b6c3c68c665d9c87f5e4940523faeb89ca26e09670adb0c: Status 404 returned error can't find the container with id a7e966b596f908ea6b6c3c68c665d9c87f5e4940523faeb89ca26e09670adb0c Apr 20 13:31:17.824690 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:17.824660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:17.824852 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.824833 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:17.824891 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.824856 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:17.824891 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.824867 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:17.824975 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:17.824915 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. No retries permitted until 2026-04-20 13:31:18.82490109 +0000 UTC m=+4.354827452 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:18.038824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.038783 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 13:26:16 +0000 UTC" deadline="2027-12-13 15:34:52.634799762 +0000 UTC" Apr 20 13:31:18.038824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.038819 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14450h3m34.59598396s" Apr 20 13:31:18.129892 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.129859 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:18.130071 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.130005 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:18.146923 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.146863 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xnv4k" event={"ID":"c062d4f4-2415-4685-915a-14cbd0991ab3","Type":"ContainerStarted","Data":"79e3dbb41c923999cbd32c0062f7960a692c8bc84bd14cbcab5c2458f6b381f1"} Apr 20 13:31:18.153428 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.153392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-48ws4" event={"ID":"fe989244-9412-433d-9a95-1acaacc7f0cb","Type":"ContainerStarted","Data":"c4d6367d97062d304e6c2a6b7fe06f35b09252cd95255e78741ba57a6da73de4"} Apr 20 13:31:18.157360 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.157312 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s9jnj" event={"ID":"d1caa7df-6d09-474b-b1e5-e18a510edd97","Type":"ContainerStarted","Data":"639efb70f6ded1230dc3c564f6d2f540c3ac468fbba78a555ed7b62dd2829b77"} Apr 20 13:31:18.174325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.174276 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75q77" event={"ID":"e0388411-4485-4a66-9511-1c06b60790d7","Type":"ContainerStarted","Data":"8c80b9944531d87f2926cf14b8553979780624469335a71a4a2f54d87de3e1c2"} Apr 20 13:31:18.180137 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.180083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" event={"ID":"3ed7a5eb-6816-49f2-bd12-bed6a3be081f","Type":"ContainerStarted","Data":"aac21864861740f9483af81a063d2924f169c6fcb4390187073c38b74195a94b"} Apr 20 13:31:18.183820 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.183740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" 
event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerStarted","Data":"063be3e72bc513e80f0cf42605a4c79a65313b1e56beade21960c342041876a2"} Apr 20 13:31:18.192994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.192937 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qvdjn" event={"ID":"d282e4a7-f5fa-463a-a056-646bf858c554","Type":"ContainerStarted","Data":"75fca560dae59400944684ae205a6ee9839c89f2aad171ec7180721eb060d317"} Apr 20 13:31:18.202825 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.202085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" event={"ID":"31008c883fac4d80f804d5217a8035e0","Type":"ContainerStarted","Data":"d15b2acddb53c64f175f4c7037958facfccc2d148f9bf27a3c376a8413bd5573"} Apr 20 13:31:18.211987 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.211926 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"a7e966b596f908ea6b6c3c68c665d9c87f5e4940523faeb89ca26e09670adb0c"} Apr 20 13:31:18.222224 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.222162 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-1.ec2.internal" podStartSLOduration=2.222146721 podStartE2EDuration="2.222146721s" podCreationTimestamp="2026-04-20 13:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:18.221133495 +0000 UTC m=+3.751059877" watchObservedRunningTime="2026-04-20 13:31:18.222146721 +0000 UTC m=+3.752073104" Apr 20 13:31:18.225352 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.225309 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v9qp2" 
event={"ID":"a929d3eb-e608-498a-85b2-7ac9ff81b424","Type":"ContainerStarted","Data":"249445fc25598a45b2cc88f95fbd6251935e8f7daa0ca497cc47b1512abc81ed"} Apr 20 13:31:18.631198 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.631168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:18.631478 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.631385 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:18.631478 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.631453 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:20.631424519 +0000 UTC m=+6.161350884 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:18.833371 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:18.832767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:18.833371 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.832896 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:18.833371 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.832924 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:18.833371 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.832937 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:18.833371 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:18.832998 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:20.832979447 +0000 UTC m=+6.362905811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:19.132898 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:19.132765 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:19.132898 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:19.132887 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:19.255783 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:19.255161 2573 generic.go:358] "Generic (PLEG): container finished" podID="b85f0afc60480346cf8546c53d0560e5" containerID="7aa6e87b537bfd8572ff0e2bf4cb1f34bab4f922c784f36ed67f9a485611c2f0" exitCode=0 Apr 20 13:31:19.255783 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:19.255703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" event={"ID":"b85f0afc60480346cf8546c53d0560e5","Type":"ContainerDied","Data":"7aa6e87b537bfd8572ff0e2bf4cb1f34bab4f922c784f36ed67f9a485611c2f0"} Apr 20 13:31:20.129797 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:20.129761 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:20.129986 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.129911 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:20.261320 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:20.260660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" event={"ID":"b85f0afc60480346cf8546c53d0560e5","Type":"ContainerStarted","Data":"0dfbfcd3d012bd93c4ccaeb7575611117e4740ab82d824cba8a08183407e2f92"} Apr 20 13:31:20.647714 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:20.647142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:20.647714 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.647316 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:20.647714 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.647383 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.647365395 +0000 UTC m=+10.177291758 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:20.848981 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:20.848941 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:20.849156 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.849138 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:20.849252 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.849162 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:20.849252 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.849176 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:20.849252 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:20.849244 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:24.849224702 +0000 UTC m=+10.379151076 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:21.130313 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:21.130268 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:21.130502 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:21.130417 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:22.130357 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:22.130323 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:22.130872 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:22.130467 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:23.131001 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:23.130481 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:23.131001 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:23.130618 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:24.130025 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:24.129930 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:24.130208 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.130068 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:24.678379 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:24.678296 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:24.678954 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.678432 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:24.678954 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.678508 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:32.678487751 +0000 UTC m=+18.208414116 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:24.880118 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:24.880040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:24.880276 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.880219 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:24.880276 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.880245 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:24.880276 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.880258 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:24.880440 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:24.880325 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:32.880306103 +0000 UTC m=+18.410232467 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:25.131034 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:25.130567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:25.131034 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:25.130679 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:26.129849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:26.129816 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:26.130309 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:26.129966 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:27.130136 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:27.130094 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:27.130603 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:27.130232 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:28.129762 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:28.129720 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:28.129924 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:28.129882 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:29.130110 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:29.130076 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:29.130568 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:29.130201 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:30.129600 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:30.129575 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:30.129785 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:30.129687 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:31.130160 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:31.130129 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:31.130582 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:31.130252 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:32.130388 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:32.130343 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:32.130850 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.130495 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:32.732449 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:32.732411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:32.732627 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.732604 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:32.732706 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.732688 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:48.732665254 +0000 UTC m=+34.262591628 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:32.934413 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:32.934378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:32.934601 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.934523 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:32.934601 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.934547 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:32.934601 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.934560 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:32.934775 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:32.934619 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:48.934602066 +0000 UTC m=+34.464528424 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:33.129840 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:33.129738 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:33.130008 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:33.129888 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:34.130456 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:34.130423 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:34.130839 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:34.130573 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:35.130776 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.130541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:35.131202 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:35.130872 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:35.288480 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.288450 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xnv4k" event={"ID":"c062d4f4-2415-4685-915a-14cbd0991ab3","Type":"ContainerStarted","Data":"1d798e4d2b8048fb7333345e99c16201a66629a19ac3a64d2b6acd92a99efc95"} Apr 20 13:31:35.289899 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.289873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-48ws4" event={"ID":"fe989244-9412-433d-9a95-1acaacc7f0cb","Type":"ContainerStarted","Data":"f4725e18116b59c48de8401f998ed834a9d7e4f3f4e393d437732e7e9cbb4cb6"} Apr 20 13:31:35.291065 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.291042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s9jnj" event={"ID":"d1caa7df-6d09-474b-b1e5-e18a510edd97","Type":"ContainerStarted","Data":"2f21885d35f8f3cc7dc7775438504041d104f025c71f9f3775a895b8ece4b245"} Apr 20 13:31:35.292080 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.292060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" event={"ID":"3ed7a5eb-6816-49f2-bd12-bed6a3be081f","Type":"ContainerStarted","Data":"f19ed2c852a03762608671ecb69906bad21c821a479b5efdc4ab6da44f35f350"} Apr 20 13:31:35.293016 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.292993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerStarted","Data":"b4adbabf13f79e9f190b33c3b186176cbf1d45c474b6811e2429705a3c9c6578"} Apr 20 13:31:35.294872 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.294852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qvdjn" event={"ID":"d282e4a7-f5fa-463a-a056-646bf858c554","Type":"ContainerStarted","Data":"b96d1c91dc3d5b55dcdd1215773d33ef9ea2d30985d26e1bba3557f8e295da59"} Apr 20 13:31:35.296004 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.295988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"fb91d13145edde19c62c0ae83c5f9bc2ed320c4e5e5b74948c49eefc703b1f38"} Apr 20 13:31:35.304515 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.304413 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xnv4k" podStartSLOduration=3.068055766 podStartE2EDuration="20.304380684s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.705151235 +0000 UTC m=+3.235077592" lastFinishedPulling="2026-04-20 13:31:34.941476148 +0000 UTC m=+20.471402510" observedRunningTime="2026-04-20 13:31:35.304378492 +0000 UTC m=+20.834304876" watchObservedRunningTime="2026-04-20 13:31:35.304380684 +0000 UTC m=+20.834307066" Apr 20 13:31:35.304926 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.304898 2573 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-1.ec2.internal" podStartSLOduration=19.304888829 podStartE2EDuration="19.304888829s" podCreationTimestamp="2026-04-20 13:31:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:20.278710749 +0000 UTC m=+5.808637131" watchObservedRunningTime="2026-04-20 13:31:35.304888829 +0000 UTC m=+20.834815209" Apr 20 13:31:35.378840 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.378797 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-48ws4" podStartSLOduration=3.144925889 podStartE2EDuration="20.378780905s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.707190221 +0000 UTC m=+3.237116596" lastFinishedPulling="2026-04-20 13:31:34.941045239 +0000 UTC m=+20.470971612" observedRunningTime="2026-04-20 13:31:35.358840118 +0000 UTC m=+20.888766498" watchObservedRunningTime="2026-04-20 13:31:35.378780905 +0000 UTC m=+20.908707284" Apr 20 13:31:35.379048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.379023 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s9jnj" podStartSLOduration=3.10469884 podStartE2EDuration="20.379017399s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.70378347 +0000 UTC m=+3.233709836" lastFinishedPulling="2026-04-20 13:31:34.978102033 +0000 UTC m=+20.508028395" observedRunningTime="2026-04-20 13:31:35.378642492 +0000 UTC m=+20.908568871" watchObservedRunningTime="2026-04-20 13:31:35.379017399 +0000 UTC m=+20.908943778" Apr 20 13:31:35.396299 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.396239 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qvdjn" podStartSLOduration=8.089411363 
podStartE2EDuration="20.396217752s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.696994998 +0000 UTC m=+3.226921369" lastFinishedPulling="2026-04-20 13:31:30.0038014 +0000 UTC m=+15.533727758" observedRunningTime="2026-04-20 13:31:35.395524583 +0000 UTC m=+20.925450963" watchObservedRunningTime="2026-04-20 13:31:35.396217752 +0000 UTC m=+20.926144134" Apr 20 13:31:35.753695 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.753499 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:35.754036 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:35.754017 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:36.130278 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.130246 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:36.130455 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:36.130353 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:36.267220 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.267196 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 13:31:36.299405 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.299215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75q77" event={"ID":"e0388411-4485-4a66-9511-1c06b60790d7","Type":"ContainerStarted","Data":"7795dcf9d3abacda90392d96bb1457ae4baa620634ba7c681e3d7a1738f786fc"} Apr 20 13:31:36.300766 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.300726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" event={"ID":"3ed7a5eb-6816-49f2-bd12-bed6a3be081f","Type":"ContainerStarted","Data":"9f35edf5422f25794775fa75a11e84be7a89ac4ab7b4b0026221c168f472b187"} Apr 20 13:31:36.301984 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.301959 2573 generic.go:358] "Generic (PLEG): container finished" podID="650b481f-9321-4709-a40c-b7e7ad6e6429" containerID="b4adbabf13f79e9f190b33c3b186176cbf1d45c474b6811e2429705a3c9c6578" exitCode=0 Apr 20 13:31:36.302075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.302041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerDied","Data":"b4adbabf13f79e9f190b33c3b186176cbf1d45c474b6811e2429705a3c9c6578"} Apr 20 13:31:36.304384 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304366 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:31:36.304660 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304643 2573 generic.go:358] "Generic (PLEG): 
container finished" podID="884c5a2b-9d81-40ae-a58b-9b1298785f9b" containerID="a23ef4d029805d36cadd271a1dd50aa33b3422530c2136c90d1e09f1d81feb72" exitCode=1 Apr 20 13:31:36.304739 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"8bb27994c1b581886e5d0561dd936f9c0ed5e85ab751cc45b7d895802b2a9c7b"} Apr 20 13:31:36.304818 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304767 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"108c1e4d67bc7e89ae48e736d3fe304adf7a8953e4078d6117b20cbb57ba1121"} Apr 20 13:31:36.304818 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304783 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"8de231f10b9673b79c09f18df51a67aa054ba87486941e326593cf47768d4199"} Apr 20 13:31:36.304818 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"c91e9680fcb12fd5727eb0a80f781e6a8b5f31462dee7595600ae80fdd5f37d4"} Apr 20 13:31:36.304818 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.304807 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerDied","Data":"a23ef4d029805d36cadd271a1dd50aa33b3422530c2136c90d1e09f1d81feb72"} Apr 20 13:31:36.305952 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.305928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-v9qp2" event={"ID":"a929d3eb-e608-498a-85b2-7ac9ff81b424","Type":"ContainerStarted","Data":"578d5a1e27d07ef334769b1e8479e4038a32f36e24eb81a5335d815541a15884"} Apr 20 13:31:36.313470 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.313437 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-75q77" podStartSLOduration=4.072048631 podStartE2EDuration="21.313426192s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.699795401 +0000 UTC m=+3.229721773" lastFinishedPulling="2026-04-20 13:31:34.941172976 +0000 UTC m=+20.471099334" observedRunningTime="2026-04-20 13:31:36.3130787 +0000 UTC m=+21.843005091" watchObservedRunningTime="2026-04-20 13:31:36.313426192 +0000 UTC m=+21.843352571" Apr 20 13:31:36.353453 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:36.353411 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v9qp2" podStartSLOduration=4.119861895 podStartE2EDuration="21.353400533s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.707552815 +0000 UTC m=+3.237479184" lastFinishedPulling="2026-04-20 13:31:34.941091448 +0000 UTC m=+20.471017822" observedRunningTime="2026-04-20 13:31:36.328159491 +0000 UTC m=+21.858085865" watchObservedRunningTime="2026-04-20 13:31:36.353400533 +0000 UTC m=+21.883326912" Apr 20 13:31:37.082230 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.082117 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T13:31:36.267215562Z","UUID":"5d627fa3-2b83-477c-8452-9269711e192b","Handler":null,"Name":"","Endpoint":""} Apr 20 13:31:37.084101 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.084081 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI 
Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 13:31:37.084218 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.084108 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 13:31:37.129911 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.129882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:37.130095 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:37.130012 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:37.309903 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.309864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" event={"ID":"3ed7a5eb-6816-49f2-bd12-bed6a3be081f","Type":"ContainerStarted","Data":"79b78c518b604aa915d3a9bd4166af61c3df185910595a12b185437e031d7c5e"} Apr 20 13:31:37.309903 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.309878 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 13:31:37.327547 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.327491 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nvzpg" podStartSLOduration=2.944190927 podStartE2EDuration="22.327476952s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.699744333 +0000 UTC m=+3.229670706" 
lastFinishedPulling="2026-04-20 13:31:37.083030372 +0000 UTC m=+22.612956731" observedRunningTime="2026-04-20 13:31:37.32747594 +0000 UTC m=+22.857402321" watchObservedRunningTime="2026-04-20 13:31:37.327476952 +0000 UTC m=+22.857403333" Apr 20 13:31:37.789785 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.789739 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:37.790420 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:37.790356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xnv4k" Apr 20 13:31:38.130330 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:38.130248 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:38.130484 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:38.130390 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:38.314904 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:38.314874 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:31:38.315437 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:38.315213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"6b207ddcfdf443af715e3a967034187b3ed5ded6f0c1aedeb962716452e7ae87"} Apr 20 13:31:39.129714 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:39.129683 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:39.129881 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:39.129817 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:40.130456 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:40.130422 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:40.131068 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:40.130547 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:40.322228 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:40.321789 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:31:40.322311 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:40.322282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"4e27bd1fae44614aa0c7615294b79f11410ae9192e2d9ee2f494fdfa313a7909"} Apr 20 13:31:40.322901 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:40.322629 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:40.322901 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:40.322854 2573 scope.go:117] "RemoveContainer" containerID="a23ef4d029805d36cadd271a1dd50aa33b3422530c2136c90d1e09f1d81feb72" Apr 20 13:31:40.339650 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:40.339624 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:41.129765 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.129718 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:41.129929 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:41.129829 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:41.325896 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.325860 2573 generic.go:358] "Generic (PLEG): container finished" podID="650b481f-9321-4709-a40c-b7e7ad6e6429" containerID="d54393f9f3aa6cfd3e9688d72a7859bfcae2f72d568a25c8461033cd22083d97" exitCode=0 Apr 20 13:31:41.326371 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.325951 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerDied","Data":"d54393f9f3aa6cfd3e9688d72a7859bfcae2f72d568a25c8461033cd22083d97"} Apr 20 13:31:41.329321 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.329304 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:31:41.329607 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.329585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" event={"ID":"884c5a2b-9d81-40ae-a58b-9b1298785f9b","Type":"ContainerStarted","Data":"42b273f6cc89b1d2784d8d4ffa6543be984263abc3fa3c5fc11e0032bc16c112"} Apr 20 13:31:41.329903 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.329888 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:41.329994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.329915 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:41.343869 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:41.343844 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:31:41.379714 ip-10-0-133-1 kubenswrapper[2573]: I0420 
13:31:41.377936 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" podStartSLOduration=9.070955925 podStartE2EDuration="26.377917677s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.708689697 +0000 UTC m=+3.238616069" lastFinishedPulling="2026-04-20 13:31:35.015651464 +0000 UTC m=+20.545577821" observedRunningTime="2026-04-20 13:31:41.376790824 +0000 UTC m=+26.906719038" watchObservedRunningTime="2026-04-20 13:31:41.377917677 +0000 UTC m=+26.907844059" Apr 20 13:31:42.129725 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:42.129692 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:42.129901 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:42.129816 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:42.332564 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:42.332536 2573 generic.go:358] "Generic (PLEG): container finished" podID="650b481f-9321-4709-a40c-b7e7ad6e6429" containerID="91cd9da6442e60b158ab48e35a7248538532250d848fb36a7c4f40c2594e68f0" exitCode=0 Apr 20 13:31:42.332930 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:42.332623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerDied","Data":"91cd9da6442e60b158ab48e35a7248538532250d848fb36a7c4f40c2594e68f0"} Apr 20 13:31:43.129806 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:43.129781 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:43.129946 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:43.129873 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:43.336372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:43.336340 2573 generic.go:358] "Generic (PLEG): container finished" podID="650b481f-9321-4709-a40c-b7e7ad6e6429" containerID="d9bdb5aa191e9f6813584a41f2b6c436e5e7b59e4f882f30152179de257323e4" exitCode=0 Apr 20 13:31:43.336860 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:43.336409 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerDied","Data":"d9bdb5aa191e9f6813584a41f2b6c436e5e7b59e4f882f30152179de257323e4"} Apr 20 13:31:44.129600 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:44.129570 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:44.129787 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:44.129690 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:45.130591 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:45.130554 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:45.131029 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:45.130645 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:46.130119 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:46.130090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:46.130287 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:46.130219 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:47.130127 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:47.129932 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:47.130644 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:47.130234 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:48.130058 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:48.130029 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:48.130336 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.130155 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:48.754979 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:48.754952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:48.755148 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.755096 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:48.755199 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.755175 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs podName:e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35 nodeName:}" failed. No retries permitted until 2026-04-20 13:32:20.755157892 +0000 UTC m=+66.285084250 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs") pod "network-metrics-daemon-55n9j" (UID: "e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:48.957150 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:48.957103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:48.957312 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.957263 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:48.957312 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.957285 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:48.957312 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.957295 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cszcv for pod openshift-network-diagnostics/network-check-target-mfqvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:48.957414 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:48.957351 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv podName:e2b1c838-35ef-4d7c-898c-5604961fd9aa nodeName:}" failed. 
No retries permitted until 2026-04-20 13:32:20.957336921 +0000 UTC m=+66.487263280 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cszcv" (UniqueName: "kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv") pod "network-check-target-mfqvp" (UID: "e2b1c838-35ef-4d7c-898c-5604961fd9aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:49.130555 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:49.130530 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:49.130901 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:49.130626 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:49.348850 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:49.348761 2573 generic.go:358] "Generic (PLEG): container finished" podID="650b481f-9321-4709-a40c-b7e7ad6e6429" containerID="66277b0f638de2d7a5840b7c2f2340f0b187f0d04a1c422bf6eef3a7c1febc54" exitCode=0 Apr 20 13:31:49.348850 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:49.348816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerDied","Data":"66277b0f638de2d7a5840b7c2f2340f0b187f0d04a1c422bf6eef3a7c1febc54"} Apr 20 13:31:50.130371 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:50.130342 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:50.130543 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:50.130435 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:50.352930 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:50.352897 2573 generic.go:358] "Generic (PLEG): container finished" podID="650b481f-9321-4709-a40c-b7e7ad6e6429" containerID="fdd862c18c10fc9814d15207386057d262d4b356aca2d88510ed6705231a26e4" exitCode=0 Apr 20 13:31:50.353294 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:50.352937 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerDied","Data":"fdd862c18c10fc9814d15207386057d262d4b356aca2d88510ed6705231a26e4"} Apr 20 13:31:51.129768 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:51.129723 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:51.129957 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:51.129881 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:51.357549 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:51.357513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" event={"ID":"650b481f-9321-4709-a40c-b7e7ad6e6429","Type":"ContainerStarted","Data":"d652642011c295e6c8116210bcfcf227004ec757bb25bc885845d5018be58aac"} Apr 20 13:31:51.387880 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:51.387793 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bwnwj" podStartSLOduration=5.332530508 podStartE2EDuration="36.387778955s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:17.699561171 +0000 UTC m=+3.229487528" lastFinishedPulling="2026-04-20 13:31:48.754809599 +0000 UTC m=+34.284735975" observedRunningTime="2026-04-20 13:31:51.387232339 +0000 UTC m=+36.917158719" watchObservedRunningTime="2026-04-20 13:31:51.387778955 +0000 UTC m=+36.917705335" Apr 20 13:31:52.129676 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:52.129639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:52.129900 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:52.129789 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:53.130513 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:53.130474 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:53.130945 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:53.130587 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:54.130172 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:54.130138 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:54.130338 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:54.130271 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:54.160805 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:54.160775 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mfqvp"] Apr 20 13:31:54.161498 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:54.160878 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:31:54.161498 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:54.160961 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa" Apr 20 13:31:54.163928 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:54.163903 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-55n9j"] Apr 20 13:31:54.361957 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:54.361924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:54.362140 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:54.362028 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35" Apr 20 13:31:56.129618 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:56.129541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:31:56.129995 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:56.129538 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp"
Apr 20 13:31:56.129995 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:56.129642 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35"
Apr 20 13:31:56.129995 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:56.129694 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa"
Apr 20 13:31:58.129902 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.129626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp"
Apr 20 13:31:58.129902 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.129634 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j"
Apr 20 13:31:58.130280 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:58.129902 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mfqvp" podUID="e2b1c838-35ef-4d7c-898c-5604961fd9aa"
Apr 20 13:31:58.130280 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:58.129976 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-55n9j" podUID="e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35"
Apr 20 13:31:58.245837 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.245811 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-1.ec2.internal" event="NodeReady"
Apr 20 13:31:58.245991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.245921 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 13:31:58.308161 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.308125 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wwvkk"]
Apr 20 13:31:58.319572 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.319543 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6cvrv"]
Apr 20 13:31:58.319737 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.319714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.321984 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.321965 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 13:31:58.322181 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.322166 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 13:31:58.322576 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.322556 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c8wr5\""
Apr 20 13:31:58.329249 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.329230 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwvkk"]
Apr 20 13:31:58.329339 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.329327 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.331351 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.331332 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6cvrv"]
Apr 20 13:31:58.331880 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.331861 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 13:31:58.331977 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.331958 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 13:31:58.332342 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.332328 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 13:31:58.332393 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.332362 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mqqqc\""
Apr 20 13:31:58.409617 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.409532 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l2cjz"]
Apr 20 13:31:58.418984 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.418956 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.422609 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.422585 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vbdzl\""
Apr 20 13:31:58.422726 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.422624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 13:31:58.422726 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.422592 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 13:31:58.422726 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.422697 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 13:31:58.423095 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.423080 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 13:31:58.426638 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.426618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b848630b-ad2b-4d47-be50-9df586a4911a-metrics-tls\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.426783 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.426667 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b848630b-ad2b-4d47-be50-9df586a4911a-config-volume\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.426783 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.426741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b848630b-ad2b-4d47-be50-9df586a4911a-tmp-dir\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.426915 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.426798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khf2t\" (UniqueName: \"kubernetes.io/projected/b848630b-ad2b-4d47-be50-9df586a4911a-kube-api-access-khf2t\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.433773 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.433735 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l2cjz"]
Apr 20 13:31:58.528054 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64a3417b-0d23-4d64-bb08-acd796c20cc1-data-volume\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528062 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khf2t\" (UniqueName: \"kubernetes.io/projected/b848630b-ad2b-4d47-be50-9df586a4911a-kube-api-access-khf2t\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64a3417b-0d23-4d64-bb08-acd796c20cc1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528101 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3909667c-16ab-4114-9adb-f5a8ef49a1fe-cert\") pod \"ingress-canary-6cvrv\" (UID: \"3909667c-16ab-4114-9adb-f5a8ef49a1fe\") " pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxgvs\" (UniqueName: \"kubernetes.io/projected/64a3417b-0d23-4d64-bb08-acd796c20cc1-kube-api-access-cxgvs\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b848630b-ad2b-4d47-be50-9df586a4911a-metrics-tls\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5bn\" (UniqueName: \"kubernetes.io/projected/3909667c-16ab-4114-9adb-f5a8ef49a1fe-kube-api-access-4p5bn\") pod \"ingress-canary-6cvrv\" (UID: \"3909667c-16ab-4114-9adb-f5a8ef49a1fe\") " pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.528232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b848630b-ad2b-4d47-be50-9df586a4911a-config-volume\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.528471 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64a3417b-0d23-4d64-bb08-acd796c20cc1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.528471 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b848630b-ad2b-4d47-be50-9df586a4911a-tmp-dir\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.528471 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64a3417b-0d23-4d64-bb08-acd796c20cc1-crio-socket\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.528636 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b848630b-ad2b-4d47-be50-9df586a4911a-tmp-dir\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.528848 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.528826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b848630b-ad2b-4d47-be50-9df586a4911a-config-volume\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.532304 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.532283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b848630b-ad2b-4d47-be50-9df586a4911a-metrics-tls\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.538543 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.538519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khf2t\" (UniqueName: \"kubernetes.io/projected/b848630b-ad2b-4d47-be50-9df586a4911a-kube-api-access-khf2t\") pod \"dns-default-wwvkk\" (UID: \"b848630b-ad2b-4d47-be50-9df586a4911a\") " pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.629526 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629493 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:31:58.629671 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64a3417b-0d23-4d64-bb08-acd796c20cc1-crio-socket\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.629671 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64a3417b-0d23-4d64-bb08-acd796c20cc1-data-volume\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.629671 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64a3417b-0d23-4d64-bb08-acd796c20cc1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.629873 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3909667c-16ab-4114-9adb-f5a8ef49a1fe-cert\") pod \"ingress-canary-6cvrv\" (UID: \"3909667c-16ab-4114-9adb-f5a8ef49a1fe\") " pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.629873 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxgvs\" (UniqueName: \"kubernetes.io/projected/64a3417b-0d23-4d64-bb08-acd796c20cc1-kube-api-access-cxgvs\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.629873 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629733 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5bn\" (UniqueName: \"kubernetes.io/projected/3909667c-16ab-4114-9adb-f5a8ef49a1fe-kube-api-access-4p5bn\") pod \"ingress-canary-6cvrv\" (UID: \"3909667c-16ab-4114-9adb-f5a8ef49a1fe\") " pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.629873 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64a3417b-0d23-4d64-bb08-acd796c20cc1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.630045 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.629894 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/64a3417b-0d23-4d64-bb08-acd796c20cc1-crio-socket\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.630357 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.630245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/64a3417b-0d23-4d64-bb08-acd796c20cc1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.632051 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.632031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3909667c-16ab-4114-9adb-f5a8ef49a1fe-cert\") pod \"ingress-canary-6cvrv\" (UID: \"3909667c-16ab-4114-9adb-f5a8ef49a1fe\") " pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.635222 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.635200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/64a3417b-0d23-4d64-bb08-acd796c20cc1-data-volume\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.637082 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.637058 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/64a3417b-0d23-4d64-bb08-acd796c20cc1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.640394 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.640367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxgvs\" (UniqueName: \"kubernetes.io/projected/64a3417b-0d23-4d64-bb08-acd796c20cc1-kube-api-access-cxgvs\") pod \"insights-runtime-extractor-l2cjz\" (UID: \"64a3417b-0d23-4d64-bb08-acd796c20cc1\") " pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.640677 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.640658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5bn\" (UniqueName: \"kubernetes.io/projected/3909667c-16ab-4114-9adb-f5a8ef49a1fe-kube-api-access-4p5bn\") pod \"ingress-canary-6cvrv\" (UID: \"3909667c-16ab-4114-9adb-f5a8ef49a1fe\") " pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:58.727131 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.727103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l2cjz"
Apr 20 13:31:58.787024 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.786997 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wwvkk"]
Apr 20 13:31:58.791439 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:58.791413 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb848630b_ad2b_4d47_be50_9df586a4911a.slice/crio-5b736436b4a0a64daf237bed451cf0184b026c7e6dd1db31ab292eec639043c6 WatchSource:0}: Error finding container 5b736436b4a0a64daf237bed451cf0184b026c7e6dd1db31ab292eec639043c6: Status 404 returned error can't find the container with id 5b736436b4a0a64daf237bed451cf0184b026c7e6dd1db31ab292eec639043c6
Apr 20 13:31:58.849554 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.849529 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l2cjz"]
Apr 20 13:31:58.855182 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:58.855154 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64a3417b_0d23_4d64_bb08_acd796c20cc1.slice/crio-2f33c6145c7850d3fa5e2560d7e24c29b0f879c3563e55f98c0af6e48e265cf4 WatchSource:0}: Error finding container 2f33c6145c7850d3fa5e2560d7e24c29b0f879c3563e55f98c0af6e48e265cf4: Status 404 returned error can't find the container with id 2f33c6145c7850d3fa5e2560d7e24c29b0f879c3563e55f98c0af6e48e265cf4
Apr 20 13:31:58.937703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:58.937682 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6cvrv"
Apr 20 13:31:59.056579 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.056548 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6cvrv"]
Apr 20 13:31:59.065966 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:31:59.065938 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3909667c_16ab_4114_9adb_f5a8ef49a1fe.slice/crio-5f3cdc9c5dd0d76f4205326dc676978b064fe792c69cf12761093f445dd80bbc WatchSource:0}: Error finding container 5f3cdc9c5dd0d76f4205326dc676978b064fe792c69cf12761093f445dd80bbc: Status 404 returned error can't find the container with id 5f3cdc9c5dd0d76f4205326dc676978b064fe792c69cf12761093f445dd80bbc
Apr 20 13:31:59.372420 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.372159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6cvrv" event={"ID":"3909667c-16ab-4114-9adb-f5a8ef49a1fe","Type":"ContainerStarted","Data":"5f3cdc9c5dd0d76f4205326dc676978b064fe792c69cf12761093f445dd80bbc"}
Apr 20 13:31:59.373719 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.373654 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2cjz" event={"ID":"64a3417b-0d23-4d64-bb08-acd796c20cc1","Type":"ContainerStarted","Data":"bc8cb76dcfba3ef88645d01af59f2de819a4a47304321e4551a920adeccf862a"}
Apr 20 13:31:59.373719 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.373696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2cjz" event={"ID":"64a3417b-0d23-4d64-bb08-acd796c20cc1","Type":"ContainerStarted","Data":"2f33c6145c7850d3fa5e2560d7e24c29b0f879c3563e55f98c0af6e48e265cf4"}
Apr 20 13:31:59.374973 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.374941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwvkk" event={"ID":"b848630b-ad2b-4d47-be50-9df586a4911a","Type":"ContainerStarted","Data":"5b736436b4a0a64daf237bed451cf0184b026c7e6dd1db31ab292eec639043c6"}
Apr 20 13:31:59.435404 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.435368 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"]
Apr 20 13:31:59.456146 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.456118 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"]
Apr 20 13:31:59.456306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.456184 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:31:59.458884 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.458637 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mrbdf\""
Apr 20 13:31:59.458884 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.458806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 13:31:59.535380 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.535342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/77018601-43ed-4d16-b80f-22d590fbb6ea-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7x7xq\" (UID: \"77018601-43ed-4d16-b80f-22d590fbb6ea\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:31:59.635839 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:31:59.635763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/77018601-43ed-4d16-b80f-22d590fbb6ea-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7x7xq\" (UID: \"77018601-43ed-4d16-b80f-22d590fbb6ea\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:31:59.635978 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:59.635913 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 13:31:59.636037 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:31:59.636000 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77018601-43ed-4d16-b80f-22d590fbb6ea-tls-certificates podName:77018601-43ed-4d16-b80f-22d590fbb6ea nodeName:}" failed. No retries permitted until 2026-04-20 13:32:00.135976854 +0000 UTC m=+45.665903214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/77018601-43ed-4d16-b80f-22d590fbb6ea-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-7x7xq" (UID: "77018601-43ed-4d16-b80f-22d590fbb6ea") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 13:32:00.129636 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.129600 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp"
Apr 20 13:32:00.129832 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.129600 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j"
Apr 20 13:32:00.132307 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.132284 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 13:32:00.133192 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.133171 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 13:32:00.133314 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.133272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wf844\""
Apr 20 13:32:00.133314 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.133272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 13:32:00.133424 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.133379 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gqjcl\""
Apr 20 13:32:00.139304 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.139277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/77018601-43ed-4d16-b80f-22d590fbb6ea-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7x7xq\" (UID: \"77018601-43ed-4d16-b80f-22d590fbb6ea\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:32:00.144028 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.144009 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/77018601-43ed-4d16-b80f-22d590fbb6ea-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-7x7xq\" (UID: \"77018601-43ed-4d16-b80f-22d590fbb6ea\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:32:00.369675 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:00.369643 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:32:01.332158 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:01.332107 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"]
Apr 20 13:32:01.336153 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:01.336125 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77018601_43ed_4d16_b80f_22d590fbb6ea.slice/crio-735f0035baba38a4e4ad8401da0c6ce8f396a09a052588c55f974996362ead64 WatchSource:0}: Error finding container 735f0035baba38a4e4ad8401da0c6ce8f396a09a052588c55f974996362ead64: Status 404 returned error can't find the container with id 735f0035baba38a4e4ad8401da0c6ce8f396a09a052588c55f974996362ead64
Apr 20 13:32:01.380159 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:01.380134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2cjz" event={"ID":"64a3417b-0d23-4d64-bb08-acd796c20cc1","Type":"ContainerStarted","Data":"6af3434d2349fd7aafabbd1b6f8509d7ae532a243905ab2b2d1787f35ca0e576"}
Apr 20 13:32:01.382076 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:01.382050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq" event={"ID":"77018601-43ed-4d16-b80f-22d590fbb6ea","Type":"ContainerStarted","Data":"735f0035baba38a4e4ad8401da0c6ce8f396a09a052588c55f974996362ead64"}
Apr 20 13:32:02.386935 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:02.386576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwvkk" event={"ID":"b848630b-ad2b-4d47-be50-9df586a4911a","Type":"ContainerStarted","Data":"8177e4381aa069786c5c736dcb5bfcb0fdbd6c5f315a695bc66450b9e01d5685"}
Apr 20 13:32:02.386935 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:02.386869 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:32:02.386935 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:02.386885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wwvkk" event={"ID":"b848630b-ad2b-4d47-be50-9df586a4911a","Type":"ContainerStarted","Data":"2c7e857dd56f1b04429e8ab01b01aa5a01102a4801c48f2e310b93b52784e48b"}
Apr 20 13:32:02.388089 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:02.388059 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6cvrv" event={"ID":"3909667c-16ab-4114-9adb-f5a8ef49a1fe","Type":"ContainerStarted","Data":"4ec2f59ebdae214d7e2ac3ce1956b132c755965158f0e9108373ae42ef86ec56"}
Apr 20 13:32:02.407999 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:02.407958 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wwvkk" podStartSLOduration=2.013028713 podStartE2EDuration="4.40794595s" podCreationTimestamp="2026-04-20 13:31:58 +0000 UTC" firstStartedPulling="2026-04-20 13:31:58.793429583 +0000 UTC m=+44.323355944" lastFinishedPulling="2026-04-20 13:32:01.18834682 +0000 UTC m=+46.718273181" observedRunningTime="2026-04-20 13:32:02.406639321 +0000 UTC m=+47.936565716" watchObservedRunningTime="2026-04-20 13:32:02.40794595 +0000 UTC m=+47.937872308"
Apr 20 13:32:02.424202 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:02.424096 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6cvrv" podStartSLOduration=1.98589467 podStartE2EDuration="4.424081053s" podCreationTimestamp="2026-04-20 13:31:58 +0000 UTC" firstStartedPulling="2026-04-20 13:31:59.067715881 +0000 UTC m=+44.597642240" lastFinishedPulling="2026-04-20 13:32:01.505902063 +0000 UTC m=+47.035828623" observedRunningTime="2026-04-20 13:32:02.423102707 +0000 UTC m=+47.953029087" watchObservedRunningTime="2026-04-20 13:32:02.424081053 +0000 UTC m=+47.954007435"
Apr 20 13:32:04.395133 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:04.395096 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l2cjz" event={"ID":"64a3417b-0d23-4d64-bb08-acd796c20cc1","Type":"ContainerStarted","Data":"751383e94deec4e6362a9185f9198ac23505a6b4100508eb57ce6a0eb677c966"}
Apr 20 13:32:04.396426 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:04.396402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq" event={"ID":"77018601-43ed-4d16-b80f-22d590fbb6ea","Type":"ContainerStarted","Data":"eb4058728f58857497c39da6dee5572e7bde708e535757d4dfc2445ca0728007"}
Apr 20 13:32:04.396592 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:04.396580 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:32:04.401082 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:04.401062 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq"
Apr 20 13:32:04.419963 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:04.419919 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l2cjz" podStartSLOduration=1.9166907800000001 podStartE2EDuration="6.419906351s" podCreationTimestamp="2026-04-20 13:31:58 +0000 UTC" firstStartedPulling="2026-04-20 13:31:58.932259739 +0000 UTC m=+44.462186097" lastFinishedPulling="2026-04-20 13:32:03.435475307 +0000 UTC m=+48.965401668" observedRunningTime="2026-04-20 13:32:04.419194906 +0000 UTC m=+49.949121315" watchObservedRunningTime="2026-04-20 13:32:04.419906351 +0000 UTC m=+49.949832736"
Apr 20 13:32:04.440183 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:04.440140 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-7x7xq" podStartSLOduration=3.348325527 podStartE2EDuration="5.440127539s" podCreationTimestamp="2026-04-20 13:31:59 +0000 UTC" firstStartedPulling="2026-04-20 13:32:01.340081236 +0000 UTC m=+46.870007600" lastFinishedPulling="2026-04-20 13:32:03.431883251 +0000 UTC m=+48.961809612" observedRunningTime="2026-04-20 13:32:04.440085127 +0000 UTC m=+49.970011508" watchObservedRunningTime="2026-04-20 13:32:04.440127539 +0000 UTC m=+49.970053918"
Apr 20 13:32:05.502606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.502578 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-96m7g"]
Apr 20 13:32:05.539456 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.539430 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-96m7g"]
Apr 20 13:32:05.539592 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.539536 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g"
Apr 20 13:32:05.542135 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.542102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-2lrcn\""
Apr 20 13:32:05.542265 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.542157 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 13:32:05.542265 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.542170 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 13:32:05.543049 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.543029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 13:32:05.543158 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.543051 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 13:32:05.543158 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.543061 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 13:32:05.572980 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.572957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fec29c32-4927-4343-91dc-5b24cc32dd2a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g"
Apr 20 13:32:05.573064 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.572994 2573 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.573064 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.573012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4cpj\" (UniqueName: \"kubernetes.io/projected/fec29c32-4927-4343-91dc-5b24cc32dd2a-kube-api-access-m4cpj\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.573064 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.573041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.673317 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.673278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fec29c32-4927-4343-91dc-5b24cc32dd2a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.673494 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.673339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.673494 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.673363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4cpj\" (UniqueName: \"kubernetes.io/projected/fec29c32-4927-4343-91dc-5b24cc32dd2a-kube-api-access-m4cpj\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.673494 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.673394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.673659 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:05.673490 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 13:32:05.673659 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:05.673571 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-tls podName:fec29c32-4927-4343-91dc-5b24cc32dd2a nodeName:}" failed. No retries permitted until 2026-04-20 13:32:06.173554141 +0000 UTC m=+51.703480501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-96m7g" (UID: "fec29c32-4927-4343-91dc-5b24cc32dd2a") : secret "prometheus-operator-tls" not found Apr 20 13:32:05.674073 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.674051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fec29c32-4927-4343-91dc-5b24cc32dd2a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.677079 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.677057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:05.683325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:05.683308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4cpj\" (UniqueName: \"kubernetes.io/projected/fec29c32-4927-4343-91dc-5b24cc32dd2a-kube-api-access-m4cpj\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:06.176380 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:06.176347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: 
\"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:06.188242 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:06.188217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fec29c32-4927-4343-91dc-5b24cc32dd2a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-96m7g\" (UID: \"fec29c32-4927-4343-91dc-5b24cc32dd2a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:06.449101 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:06.449016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" Apr 20 13:32:06.566492 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:06.566462 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-96m7g"] Apr 20 13:32:06.573159 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:06.573133 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfec29c32_4927_4343_91dc_5b24cc32dd2a.slice/crio-38a3c997c8bae5409c0bdc8f2beb2487d3e0b4a485e52f023427ab9aca6e6f73 WatchSource:0}: Error finding container 38a3c997c8bae5409c0bdc8f2beb2487d3e0b4a485e52f023427ab9aca6e6f73: Status 404 returned error can't find the container with id 38a3c997c8bae5409c0bdc8f2beb2487d3e0b4a485e52f023427ab9aca6e6f73 Apr 20 13:32:07.405083 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:07.405050 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" event={"ID":"fec29c32-4927-4343-91dc-5b24cc32dd2a","Type":"ContainerStarted","Data":"38a3c997c8bae5409c0bdc8f2beb2487d3e0b4a485e52f023427ab9aca6e6f73"} Apr 20 13:32:08.409116 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:08.408854 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" event={"ID":"fec29c32-4927-4343-91dc-5b24cc32dd2a","Type":"ContainerStarted","Data":"5fdaaa38c82d7960b2facce508ed867d7ee8e94eedcd96155af6c9fccf5d30fa"} Apr 20 13:32:08.409116 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:08.409037 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" event={"ID":"fec29c32-4927-4343-91dc-5b24cc32dd2a","Type":"ContainerStarted","Data":"0ed99564221df4cb2ffd73d4597831fbd225a58e748a33fa32df97ab4e39b6de"} Apr 20 13:32:08.426468 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:08.426413 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-96m7g" podStartSLOduration=1.972463697 podStartE2EDuration="3.426400515s" podCreationTimestamp="2026-04-20 13:32:05 +0000 UTC" firstStartedPulling="2026-04-20 13:32:06.575157592 +0000 UTC m=+52.105083951" lastFinishedPulling="2026-04-20 13:32:08.029094408 +0000 UTC m=+53.559020769" observedRunningTime="2026-04-20 13:32:08.425834004 +0000 UTC m=+53.955760394" watchObservedRunningTime="2026-04-20 13:32:08.426400515 +0000 UTC m=+53.956326894" Apr 20 13:32:09.877382 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.877352 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"] Apr 20 13:32:09.902461 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.902429 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8dhct"] Apr 20 13:32:09.902610 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.902590 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" Apr 20 13:32:09.904928 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.904906 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 13:32:09.905049 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.904948 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 13:32:09.905123 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.905105 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-w24qc\"" Apr 20 13:32:09.915179 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.915163 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k8hjt"] Apr 20 13:32:09.915314 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.915301 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:09.918540 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.918516 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 13:32:09.918624 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.918541 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 13:32:09.918624 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.918585 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 13:32:09.918741 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.918620 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-tx4p7\"" Apr 20 13:32:09.943415 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.943392 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"] Apr 20 13:32:09.943415 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.943411 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8dhct"] Apr 20 13:32:09.943533 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.943495 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:09.945655 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.945622 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 13:32:09.945655 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.945645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dzwpq\"" Apr 20 13:32:09.945844 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.945656 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 13:32:09.945844 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.945663 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 13:32:09.999946 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.999922 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.000055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:09.999961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000055 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:32:09.999993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98nqt\" (UniqueName: \"kubernetes.io/projected/0b5f3471-c214-4da6-8284-8a2e35239729-kube-api-access-98nqt\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-textfile\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000186 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" Apr 20 13:32:10.000186 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-wtmp\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000186 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000132 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/21e6288b-35f0-467f-85ec-0224e98f6ecf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" Apr 20 13:32:10.000289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000219 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5f3471-c214-4da6-8284-8a2e35239729-metrics-client-ca\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-sys\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" Apr 20 13:32:10.000289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8hw6\" (UniqueName: \"kubernetes.io/projected/21e6288b-35f0-467f-85ec-0224e98f6ecf-kube-api-access-m8hw6\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-root\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3a522c83-7471-4e5d-be7f-5175e61ac4cd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-tls\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-scnvz\" (UniqueName: \"kubernetes.io/projected/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-api-access-scnvz\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000402 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a522c83-7471-4e5d-be7f-5175e61ac4cd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.000454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.000439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.100914 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.100883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-root\") pod \"node-exporter-k8hjt\" (UID: 
\"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.101056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.100923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3a522c83-7471-4e5d-be7f-5175e61ac4cd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.101056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.100959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-tls\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.101056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.100984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scnvz\" (UniqueName: \"kubernetes.io/projected/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-api-access-scnvz\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.101056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.101056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a522c83-7471-4e5d-be7f-5175e61ac4cd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.100986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-root\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:10.101129 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98nqt\" (UniqueName: \"kubernetes.io/projected/0b5f3471-c214-4da6-8284-8a2e35239729-kube-api-access-98nqt\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt" Apr 20 13:32:10.101308 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:10.101312 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-tls podName:0b5f3471-c214-4da6-8284-8a2e35239729 nodeName:}" failed. No retries permitted until 2026-04-20 13:32:10.601291862 +0000 UTC m=+56.131218223 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-tls") pod "node-exporter-k8hjt" (UID: "0b5f3471-c214-4da6-8284-8a2e35239729") : secret "node-exporter-tls" not found
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101336 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-textfile\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-wtmp\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21e6288b-35f0-467f-85ec-0224e98f6ecf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5f3471-c214-4da6-8284-8a2e35239729-metrics-client-ca\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-sys\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:10.101571 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 13:32:10.101606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8hw6\" (UniqueName: \"kubernetes.io/projected/21e6288b-35f0-467f-85ec-0224e98f6ecf-kube-api-access-m8hw6\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:10.101629 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-tls podName:21e6288b-35f0-467f-85ec-0224e98f6ecf nodeName:}" failed. No retries permitted until 2026-04-20 13:32:10.601612985 +0000 UTC m=+56.131539357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-p2jvz" (UID: "21e6288b-35f0-467f-85ec-0224e98f6ecf") : secret "openshift-state-metrics-tls" not found
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a522c83-7471-4e5d-be7f-5175e61ac4cd-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-textfile\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3a522c83-7471-4e5d-be7f-5175e61ac4cd-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-wtmp\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.101990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b5f3471-c214-4da6-8284-8a2e35239729-sys\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.102075 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.102039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.102356 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.102342 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b5f3471-c214-4da6-8284-8a2e35239729-metrics-client-ca\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.102489 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.102466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/21e6288b-35f0-467f-85ec-0224e98f6ecf-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.104725 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.104701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.104877 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.104860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.104938 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.104915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.105021 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.105003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.114045 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.114024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.116119 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.116097 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98nqt\" (UniqueName: \"kubernetes.io/projected/0b5f3471-c214-4da6-8284-8a2e35239729-kube-api-access-98nqt\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.116726 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.116709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8hw6\" (UniqueName: \"kubernetes.io/projected/21e6288b-35f0-467f-85ec-0224e98f6ecf-kube-api-access-m8hw6\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.117847 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.117825 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scnvz\" (UniqueName: \"kubernetes.io/projected/3a522c83-7471-4e5d-be7f-5175e61ac4cd-kube-api-access-scnvz\") pod \"kube-state-metrics-69db897b98-8dhct\" (UID: \"3a522c83-7471-4e5d-be7f-5175e61ac4cd\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.231168 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.231139 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct"
Apr 20 13:32:10.348622 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.348593 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8dhct"]
Apr 20 13:32:10.351263 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:10.351238 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a522c83_7471_4e5d_be7f_5175e61ac4cd.slice/crio-ac6642628900c3c2f1396055c7a0b41bdb015bfff75c4f8603d25d6eae6e9429 WatchSource:0}: Error finding container ac6642628900c3c2f1396055c7a0b41bdb015bfff75c4f8603d25d6eae6e9429: Status 404 returned error can't find the container with id ac6642628900c3c2f1396055c7a0b41bdb015bfff75c4f8603d25d6eae6e9429
Apr 20 13:32:10.415334 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.415301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" event={"ID":"3a522c83-7471-4e5d-be7f-5175e61ac4cd","Type":"ContainerStarted","Data":"ac6642628900c3c2f1396055c7a0b41bdb015bfff75c4f8603d25d6eae6e9429"}
Apr 20 13:32:10.606974 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.606896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-tls\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.606974 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.606952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.609127 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.609103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b5f3471-c214-4da6-8284-8a2e35239729-node-exporter-tls\") pod \"node-exporter-k8hjt\" (UID: \"0b5f3471-c214-4da6-8284-8a2e35239729\") " pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.609223 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.609199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/21e6288b-35f0-467f-85ec-0224e98f6ecf-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p2jvz\" (UID: \"21e6288b-35f0-467f-85ec-0224e98f6ecf\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.813132 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.813100 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"
Apr 20 13:32:10.851277 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.851253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k8hjt"
Apr 20 13:32:10.859197 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:10.859146 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5f3471_c214_4da6_8284_8a2e35239729.slice/crio-c26d47f74e0cc96a3981924df02f5689d008743cde53f6b48a82eb85e1db18dc WatchSource:0}: Error finding container c26d47f74e0cc96a3981924df02f5689d008743cde53f6b48a82eb85e1db18dc: Status 404 returned error can't find the container with id c26d47f74e0cc96a3981924df02f5689d008743cde53f6b48a82eb85e1db18dc
Apr 20 13:32:10.961058 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.961027 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz"]
Apr 20 13:32:10.966259 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:10.966177 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e6288b_35f0_467f_85ec_0224e98f6ecf.slice/crio-7532a515b950b361fa34e0ac4514d4a64f91c84dfd6db43bfe4e6018b718ee48 WatchSource:0}: Error finding container 7532a515b950b361fa34e0ac4514d4a64f91c84dfd6db43bfe4e6018b718ee48: Status 404 returned error can't find the container with id 7532a515b950b361fa34e0ac4514d4a64f91c84dfd6db43bfe4e6018b718ee48
Apr 20 13:32:10.998512 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:10.998489 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 13:32:11.020616 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.019374 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 13:32:11.020616 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.019566 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.027149 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.026999 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 13:32:11.027149 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.027021 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 13:32:11.027333 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.027314 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 13:32:11.027556 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.027537 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xl7j9\""
Apr 20 13:32:11.028088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.028067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 13:32:11.028180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.028135 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 13:32:11.029657 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.028515 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 13:32:11.029657 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.028710 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 13:32:11.029657 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.029278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 13:32:11.030657 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.029928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 13:32:11.111316 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111316 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111408 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-web-config\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-out\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.111849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.111815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lzpl\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-kube-api-access-7lzpl\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212627 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212627 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-web-config\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212852 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.212994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.212967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-out\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.213878 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.213855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lzpl\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-kube-api-access-7lzpl\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.213960 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.213912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.213960 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.213947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.214071 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.213989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.214545 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.214474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.214545 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.213856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.215668 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.215610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.217242 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.217203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-web-config\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.217901 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.217857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.218339 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.218289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-out\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.218440 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.218413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.218576 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.218553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.219094 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.219073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.220026 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.219687 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.220294 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.220274 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.221564 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.221546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.223706 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.223666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lzpl\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-kube-api-access-7lzpl\") pod \"alertmanager-main-0\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.337995 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.337962 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:32:11.420884 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.420790 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" event={"ID":"21e6288b-35f0-467f-85ec-0224e98f6ecf","Type":"ContainerStarted","Data":"df87c3737e65045a074ebb255787792bfab45b038f6841799bffc0ad407299fd"}
Apr 20 13:32:11.420884 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.420843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" event={"ID":"21e6288b-35f0-467f-85ec-0224e98f6ecf","Type":"ContainerStarted","Data":"471e38f40ae17e72d391dd39c7f296031515eb0ed85a96437d9c645f498a2990"}
Apr 20 13:32:11.420884 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.420858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" event={"ID":"21e6288b-35f0-467f-85ec-0224e98f6ecf","Type":"ContainerStarted","Data":"7532a515b950b361fa34e0ac4514d4a64f91c84dfd6db43bfe4e6018b718ee48"}
Apr 20 13:32:11.421996 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.421967 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8hjt" event={"ID":"0b5f3471-c214-4da6-8284-8a2e35239729","Type":"ContainerStarted","Data":"c26d47f74e0cc96a3981924df02f5689d008743cde53f6b48a82eb85e1db18dc"}
Apr 20 13:32:11.850092 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:11.850069 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 13:32:11.852315 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:11.852292 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2abfc9be_e469_4476_8b64_7fa7acb3f5cf.slice/crio-3d8ee99a86fb96b08b27d2f136fcb9fe2f294a599e3703f66ef040240c69448f WatchSource:0}: Error finding container 3d8ee99a86fb96b08b27d2f136fcb9fe2f294a599e3703f66ef040240c69448f: Status 404 returned error can't find the container with id 3d8ee99a86fb96b08b27d2f136fcb9fe2f294a599e3703f66ef040240c69448f
Apr 20 13:32:12.393059 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.393039 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wwvkk"
Apr 20 13:32:12.427158 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.427129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" event={"ID":"3a522c83-7471-4e5d-be7f-5175e61ac4cd","Type":"ContainerStarted","Data":"b13f8869fc5f6748eba069e3d4185082a869dc605b74636f47a53aa44fcbe1c6"}
Apr 20 13:32:12.428584 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.428556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"3d8ee99a86fb96b08b27d2f136fcb9fe2f294a599e3703f66ef040240c69448f"}
Apr 20 13:32:12.970315 ip-10-0-133-1 kubenswrapper[2573]: I0420 
13:32:12.970241 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-b6688569-k52vh"] Apr 20 13:32:12.994156 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.994104 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b6688569-k52vh"] Apr 20 13:32:12.994320 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.994299 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:12.997127 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997053 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 13:32:12.997127 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997053 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 13:32:12.997333 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997156 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 13:32:12.997333 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-vpxrp\"" Apr 20 13:32:12.997333 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997156 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 13:32:12.997333 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997054 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 13:32:12.997534 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:12.997436 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5lq75ul6qikfm\"" Apr 20 13:32:13.028844 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.028819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-tls\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.028917 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.028848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.028917 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.028882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.028986 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.028925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-grpc-tls\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.028986 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.028950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgkzh\" (UniqueName: \"kubernetes.io/projected/60f42529-9f79-4b02-8616-ab3b14916104-kube-api-access-jgkzh\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.029053 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.029023 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.029088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.029057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.029122 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.029093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60f42529-9f79-4b02-8616-ab3b14916104-metrics-client-ca\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.129915 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.129873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.129915 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.129917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-grpc-tls\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.130180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.129939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgkzh\" (UniqueName: \"kubernetes.io/projected/60f42529-9f79-4b02-8616-ab3b14916104-kube-api-access-jgkzh\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.130180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.129975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.130180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.130008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b6688569-k52vh\" 
(UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.130180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.130033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60f42529-9f79-4b02-8616-ab3b14916104-metrics-client-ca\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.130180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.130108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-tls\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.130180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.130134 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.131360 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.131332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/60f42529-9f79-4b02-8616-ab3b14916104-metrics-client-ca\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.136779 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.133090 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.136779 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.133514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-grpc-tls\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.136779 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.133575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-tls\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.139852 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.139805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.139947 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.139851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b6688569-k52vh\" (UID: 
\"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.139947 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.139869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/60f42529-9f79-4b02-8616-ab3b14916104-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.141905 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.141881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgkzh\" (UniqueName: \"kubernetes.io/projected/60f42529-9f79-4b02-8616-ab3b14916104-kube-api-access-jgkzh\") pod \"thanos-querier-b6688569-k52vh\" (UID: \"60f42529-9f79-4b02-8616-ab3b14916104\") " pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.303059 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.303025 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:13.351081 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.351058 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qtnxb" Apr 20 13:32:13.432924 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.432897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" event={"ID":"3a522c83-7471-4e5d-be7f-5175e61ac4cd","Type":"ContainerStarted","Data":"6c35e96301cb5e9e772fda9827f32a794d0e915b24b555c92d5402228e34fbb4"} Apr 20 13:32:13.699360 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:13.699286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b6688569-k52vh"] Apr 20 13:32:13.701173 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:13.701147 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f42529_9f79_4b02_8616_ab3b14916104.slice/crio-a2298ab9ae8662ae6e6c151d8abb25e5ada13cbf6dfc164e67f1c6df29ef3905 WatchSource:0}: Error finding container a2298ab9ae8662ae6e6c151d8abb25e5ada13cbf6dfc164e67f1c6df29ef3905: Status 404 returned error can't find the container with id a2298ab9ae8662ae6e6c151d8abb25e5ada13cbf6dfc164e67f1c6df29ef3905 Apr 20 13:32:14.179929 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.179856 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-58fd84c859-2cb5z"] Apr 20 13:32:14.198867 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.198838 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58fd84c859-2cb5z"] Apr 20 13:32:14.199042 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.198965 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.202545 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.202518 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 13:32:14.202682 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.202576 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-gcb62\"" Apr 20 13:32:14.203357 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.203336 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 13:32:14.203662 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.203380 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 13:32:14.203744 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.203726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-d42li4b7ctmkm\"" Apr 20 13:32:14.203941 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.203927 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 13:32:14.238703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.238678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-client-ca-bundle\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.238867 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.238817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fddfc18-1911-4f8a-bc01-f13e1fee38da-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.238933 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.238862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bwj8\" (UniqueName: \"kubernetes.io/projected/3fddfc18-1911-4f8a-bc01-f13e1fee38da-kube-api-access-6bwj8\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.238987 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.238927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-secret-metrics-server-tls\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.238987 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.238979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3fddfc18-1911-4f8a-bc01-f13e1fee38da-audit-log\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.239101 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.239019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3fddfc18-1911-4f8a-bc01-f13e1fee38da-metrics-server-audit-profiles\") pod 
\"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.239101 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.239049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-secret-metrics-server-client-certs\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.339996 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.339958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-client-ca-bundle\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340167 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fddfc18-1911-4f8a-bc01-f13e1fee38da-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340167 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bwj8\" (UniqueName: \"kubernetes.io/projected/3fddfc18-1911-4f8a-bc01-f13e1fee38da-kube-api-access-6bwj8\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340167 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:32:14.340156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-secret-metrics-server-tls\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3fddfc18-1911-4f8a-bc01-f13e1fee38da-audit-log\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3fddfc18-1911-4f8a-bc01-f13e1fee38da-metrics-server-audit-profiles\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-secret-metrics-server-client-certs\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340706 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3fddfc18-1911-4f8a-bc01-f13e1fee38da-audit-log\") pod 
\"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.340868 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.340848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fddfc18-1911-4f8a-bc01-f13e1fee38da-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.341392 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.341368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3fddfc18-1911-4f8a-bc01-f13e1fee38da-metrics-server-audit-profiles\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.342900 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.342871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-client-ca-bundle\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.343119 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.343093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-secret-metrics-server-client-certs\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:14.343545 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.343526 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3fddfc18-1911-4f8a-bc01-f13e1fee38da-secret-metrics-server-tls\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z"
Apr 20 13:32:14.349743 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.349718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bwj8\" (UniqueName: \"kubernetes.io/projected/3fddfc18-1911-4f8a-bc01-f13e1fee38da-kube-api-access-6bwj8\") pod \"metrics-server-58fd84c859-2cb5z\" (UID: \"3fddfc18-1911-4f8a-bc01-f13e1fee38da\") " pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z"
Apr 20 13:32:14.437864 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.437782 2573 generic.go:358] "Generic (PLEG): container finished" podID="0b5f3471-c214-4da6-8284-8a2e35239729" containerID="749f6169561a3be70d8806cf020589832fbbaa89cbac40ad8de147dd728644ab" exitCode=0
Apr 20 13:32:14.438293 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.437881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8hjt" event={"ID":"0b5f3471-c214-4da6-8284-8a2e35239729","Type":"ContainerDied","Data":"749f6169561a3be70d8806cf020589832fbbaa89cbac40ad8de147dd728644ab"}
Apr 20 13:32:14.439094 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.439069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"a2298ab9ae8662ae6e6c151d8abb25e5ada13cbf6dfc164e67f1c6df29ef3905"}
Apr 20 13:32:14.441272 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.441241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" event={"ID":"3a522c83-7471-4e5d-be7f-5175e61ac4cd","Type":"ContainerStarted","Data":"56f27f3dbb5cbcda7202a0c9c9198c098e80d3e2c1d8b5db315245f7a9f9e047"}
Apr 20 13:32:14.443668 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.443638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" event={"ID":"21e6288b-35f0-467f-85ec-0224e98f6ecf","Type":"ContainerStarted","Data":"fc4fdaf177843c077629712ac3714b59d5e887bf6bc73013805cb17fe967a459"}
Apr 20 13:32:14.445065 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.445042 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256" exitCode=0
Apr 20 13:32:14.445166 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.445082 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256"}
Apr 20 13:32:14.486528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.486483 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p2jvz" podStartSLOduration=3.146397869 podStartE2EDuration="5.486468227s" podCreationTimestamp="2026-04-20 13:32:09 +0000 UTC" firstStartedPulling="2026-04-20 13:32:11.229316612 +0000 UTC m=+56.759242976" lastFinishedPulling="2026-04-20 13:32:13.56938697 +0000 UTC m=+59.099313334" observedRunningTime="2026-04-20 13:32:14.485680723 +0000 UTC m=+60.015607103" watchObservedRunningTime="2026-04-20 13:32:14.486468227 +0000 UTC m=+60.016394586"
Apr 20 13:32:14.510560 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.510533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z"
Apr 20 13:32:14.520729 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.518161 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-8dhct" podStartSLOduration=3.548374559 podStartE2EDuration="5.518134269s" podCreationTimestamp="2026-04-20 13:32:09 +0000 UTC" firstStartedPulling="2026-04-20 13:32:10.353030563 +0000 UTC m=+55.882956921" lastFinishedPulling="2026-04-20 13:32:12.322790267 +0000 UTC m=+57.852716631" observedRunningTime="2026-04-20 13:32:14.516314138 +0000 UTC m=+60.046240518" watchObservedRunningTime="2026-04-20 13:32:14.518134269 +0000 UTC m=+60.048060651"
Apr 20 13:32:14.658242 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.658208 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58fd84c859-2cb5z"]
Apr 20 13:32:14.661694 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:14.661658 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fddfc18_1911_4f8a_bc01_f13e1fee38da.slice/crio-3478494910d26553ae93369786f4f4f6ec65ff1925e6df10c58e90237e31c447 WatchSource:0}: Error finding container 3478494910d26553ae93369786f4f4f6ec65ff1925e6df10c58e90237e31c447: Status 404 returned error can't find the container with id 3478494910d26553ae93369786f4f4f6ec65ff1925e6df10c58e90237e31c447
Apr 20 13:32:14.673311 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.673285 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"]
Apr 20 13:32:14.683703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.683682 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"
Apr 20 13:32:14.686858 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.686802 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 13:32:14.686981 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.686901 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6l4xv\""
Apr 20 13:32:14.688161 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.688102 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"]
Apr 20 13:32:14.744536 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.744498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9b945b1-227f-44ca-b322-5d475bfba434-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cxm2d\" (UID: \"c9b945b1-227f-44ca-b322-5d475bfba434\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"
Apr 20 13:32:14.845592 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:14.845554 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9b945b1-227f-44ca-b322-5d475bfba434-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cxm2d\" (UID: \"c9b945b1-227f-44ca-b322-5d475bfba434\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"
Apr 20 13:32:14.845773 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:14.845731 2573 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 13:32:14.845846 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:32:14.845834 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9b945b1-227f-44ca-b322-5d475bfba434-monitoring-plugin-cert podName:c9b945b1-227f-44ca-b322-5d475bfba434 nodeName:}" failed. No retries permitted until 2026-04-20 13:32:15.345812494 +0000 UTC m=+60.875738857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c9b945b1-227f-44ca-b322-5d475bfba434-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-cxm2d" (UID: "c9b945b1-227f-44ca-b322-5d475bfba434") : secret "monitoring-plugin-cert" not found
Apr 20 13:32:15.352451 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.352276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9b945b1-227f-44ca-b322-5d475bfba434-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cxm2d\" (UID: \"c9b945b1-227f-44ca-b322-5d475bfba434\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"
Apr 20 13:32:15.356064 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.356013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c9b945b1-227f-44ca-b322-5d475bfba434-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cxm2d\" (UID: \"c9b945b1-227f-44ca-b322-5d475bfba434\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"
Apr 20 13:32:15.452364 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.452318 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8hjt" event={"ID":"0b5f3471-c214-4da6-8284-8a2e35239729","Type":"ContainerStarted","Data":"5a4b0a74be2f8838c84fe6e9cecd1e7c86abec686001c1daeee5cbf49f64ab73"}
Apr 20 13:32:15.452834 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.452372 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8hjt" event={"ID":"0b5f3471-c214-4da6-8284-8a2e35239729","Type":"ContainerStarted","Data":"d2a5cf94cb047b3fb60ae9992ab1d2f44e148b665368c1fe81e80c3879f08b40"}
Apr 20 13:32:15.453555 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.453513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" event={"ID":"3fddfc18-1911-4f8a-bc01-f13e1fee38da","Type":"ContainerStarted","Data":"3478494910d26553ae93369786f4f4f6ec65ff1925e6df10c58e90237e31c447"}
Apr 20 13:32:15.477679 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.476715 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k8hjt" podStartSLOduration=3.794422475 podStartE2EDuration="6.476695601s" podCreationTimestamp="2026-04-20 13:32:09 +0000 UTC" firstStartedPulling="2026-04-20 13:32:10.860962617 +0000 UTC m=+56.390888976" lastFinishedPulling="2026-04-20 13:32:13.543235736 +0000 UTC m=+59.073162102" observedRunningTime="2026-04-20 13:32:15.474491814 +0000 UTC m=+61.004418197" watchObservedRunningTime="2026-04-20 13:32:15.476695601 +0000 UTC m=+61.006621981"
Apr 20 13:32:15.599809 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.599779 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-6l4xv\""
Apr 20 13:32:15.607891 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:15.607812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"
Apr 20 13:32:16.141322 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.141293 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 13:32:16.146093 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.146069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.149361 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.148916 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 13:32:16.149361 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.149170 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-h9tlz\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.149885 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.149972 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150058 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150102 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150105 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.149978 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150169 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150198 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fobai5o6bovg8\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150139 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 13:32:16.150306 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 13:32:16.150865 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.150360 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 13:32:16.151864 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.151845 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 13:32:16.159803 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.159742 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 13:32:16.259851 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.259820 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-web-config\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260033 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.259860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260033 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.259900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-config\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260033 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.259957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260298 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-config-out\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6pj\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-kube-api-access-8r6pj\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.260528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.260508 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.361622 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-web-config\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362009 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362009 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-config\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362009 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362009 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362009 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362009 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.361822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362353 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.362194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362353 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.362251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362353 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.362284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.362353 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.362314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.364429 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.364546 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364471 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-config-out\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.364546 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6pj\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-kube-api-access-8r6pj\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.364638 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364559 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.364638 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.364734 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.366109 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.364780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.366109 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.365682 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.366109 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.365940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.366517 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.366458 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.369508 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.368075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.369508 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.368943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.369508 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.369398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.369508 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.369455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.370394 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.370369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-config\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.374087 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.371540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-web-config\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.374087 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.372089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.374249 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.374212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.375189 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.374807 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-config-out\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.375958 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.375937 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.376880 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.376856 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.378015 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.376967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.378015 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.377965 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.378787 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.378763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.379238 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.379217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6pj\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-kube-api-access-8r6pj\") pod \"prometheus-k8s-0\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.459351 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.459261 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:32:16.829393 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.829210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 13:32:16.836156 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:16.836107 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd19b1d29_4f03_425e_9146_821f748286fe.slice/crio-cb697fae9e52a1d70219ea6853d049696c7b828e8be1c52faa533a44423e370b WatchSource:0}: Error finding container cb697fae9e52a1d70219ea6853d049696c7b828e8be1c52faa533a44423e370b: Status 404 returned error can't find the container with id cb697fae9e52a1d70219ea6853d049696c7b828e8be1c52faa533a44423e370b
Apr 20 13:32:16.845211 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:16.844591 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d"]
Apr 20 13:32:16.853282 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:16.853247 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b945b1_227f_44ca_b322_5d475bfba434.slice/crio-683fab74172237721b6da3bdef4e2100e585c6feccfcd9f64bdafb9280d29535 WatchSource:0}: Error finding container 683fab74172237721b6da3bdef4e2100e585c6feccfcd9f64bdafb9280d29535: Status 404 returned error can't find the container with id 683fab74172237721b6da3bdef4e2100e585c6feccfcd9f64bdafb9280d29535
Apr 20 13:32:17.464509 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.464404 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" event={"ID":"3fddfc18-1911-4f8a-bc01-f13e1fee38da","Type":"ContainerStarted","Data":"4e3f8169dda1d21c7dc30b4671b09de0a780a8807a33478c82958327a00bbf7a"}
Apr 20 13:32:17.465911 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.465859 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d" event={"ID":"c9b945b1-227f-44ca-b322-5d475bfba434","Type":"ContainerStarted","Data":"683fab74172237721b6da3bdef4e2100e585c6feccfcd9f64bdafb9280d29535"}
Apr 20 13:32:17.468521 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.468494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"d231a7ae75ecd7b408fea454477083a9224ba5982ec331d2498c9ad074a0e252"}
Apr 20 13:32:17.468605 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.468536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"4f6dd0e9ff8a60e5323b055533114671e4622f71c051745c6dbd67e8cd8b9b76"}
Apr 20 13:32:17.468605 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.468552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"51fe8438f9e4f270ed0446d13d5f8e0246b80726e9623fc9d8a4cd104cc5c833"}
Apr 20 13:32:17.470523 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.470393 2573 generic.go:358] "Generic (PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98" exitCode=0
Apr 20 13:32:17.470523 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.470483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98"}
Apr 20 13:32:17.470523
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.470515 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"cb697fae9e52a1d70219ea6853d049696c7b828e8be1c52faa533a44423e370b"} Apr 20 13:32:17.475040 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.474955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"} Apr 20 13:32:17.475040 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.474984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"} Apr 20 13:32:17.475040 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.474998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"} Apr 20 13:32:17.475040 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.475012 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"} Apr 20 13:32:17.475040 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.475025 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"} Apr 20 
13:32:17.491583 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:17.490071 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" podStartSLOduration=1.4244535329999999 podStartE2EDuration="3.49005572s" podCreationTimestamp="2026-04-20 13:32:14 +0000 UTC" firstStartedPulling="2026-04-20 13:32:14.664529756 +0000 UTC m=+60.194456114" lastFinishedPulling="2026-04-20 13:32:16.730131926 +0000 UTC m=+62.260058301" observedRunningTime="2026-04-20 13:32:17.489381517 +0000 UTC m=+63.019307898" watchObservedRunningTime="2026-04-20 13:32:17.49005572 +0000 UTC m=+63.019982102" Apr 20 13:32:18.480029 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.479978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d" event={"ID":"c9b945b1-227f-44ca-b322-5d475bfba434","Type":"ContainerStarted","Data":"055a1f3b122276f0cf2669aed68e9cdb2332c23cdac200081ef5085e312da63f"} Apr 20 13:32:18.480511 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.480478 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d" Apr 20 13:32:18.483700 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.483670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"712a530058c9e7edcbecd875b5fef61ba62ec1b095931b570300646223773f2f"} Apr 20 13:32:18.483846 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.483703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"aba5e2d1ed7d37bf52e0a3fab953300f8f6c423527c61a0e75b282f22c9e9971"} Apr 20 13:32:18.483846 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.483716 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" event={"ID":"60f42529-9f79-4b02-8616-ab3b14916104","Type":"ContainerStarted","Data":"a5cc690e7abbf0de5d5cb22a4463957746ad315e26bd2e5ac076f5f030e7da7c"} Apr 20 13:32:18.484023 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.484004 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:18.486478 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.486459 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d" Apr 20 13:32:18.487808 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.487785 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerStarted","Data":"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"} Apr 20 13:32:18.497651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.497604 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cxm2d" podStartSLOduration=3.237491719 podStartE2EDuration="4.497589224s" podCreationTimestamp="2026-04-20 13:32:14 +0000 UTC" firstStartedPulling="2026-04-20 13:32:16.859510816 +0000 UTC m=+62.389437174" lastFinishedPulling="2026-04-20 13:32:18.119608319 +0000 UTC m=+63.649534679" observedRunningTime="2026-04-20 13:32:18.497092479 +0000 UTC m=+64.027018875" watchObservedRunningTime="2026-04-20 13:32:18.497589224 +0000 UTC m=+64.027515605" Apr 20 13:32:18.526267 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.526219 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.261416861 podStartE2EDuration="8.526190552s" podCreationTimestamp="2026-04-20 13:32:10 +0000 UTC" 
firstStartedPulling="2026-04-20 13:32:11.854188226 +0000 UTC m=+57.384114585" lastFinishedPulling="2026-04-20 13:32:18.118961906 +0000 UTC m=+63.648888276" observedRunningTime="2026-04-20 13:32:18.524892501 +0000 UTC m=+64.054818882" watchObservedRunningTime="2026-04-20 13:32:18.526190552 +0000 UTC m=+64.056116934" Apr 20 13:32:18.549343 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:18.549287 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" podStartSLOduration=2.131667907 podStartE2EDuration="6.549273563s" podCreationTimestamp="2026-04-20 13:32:12 +0000 UTC" firstStartedPulling="2026-04-20 13:32:13.703951153 +0000 UTC m=+59.233877517" lastFinishedPulling="2026-04-20 13:32:18.121556815 +0000 UTC m=+63.651483173" observedRunningTime="2026-04-20 13:32:18.547659783 +0000 UTC m=+64.077586198" watchObservedRunningTime="2026-04-20 13:32:18.549273563 +0000 UTC m=+64.079199943" Apr 20 13:32:20.497596 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.497562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25"} Apr 20 13:32:20.498012 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.497602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c"} Apr 20 13:32:20.810958 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.810926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " 
pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:32:20.814053 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.814030 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 13:32:20.823374 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.823355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35-metrics-certs\") pod \"network-metrics-daemon-55n9j\" (UID: \"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35\") " pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:32:20.845791 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.845730 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gqjcl\"" Apr 20 13:32:20.853616 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.853603 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-55n9j" Apr 20 13:32:20.987361 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:20.987282 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-55n9j"] Apr 20 13:32:20.989994 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:20.989967 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f40fcc_0e0b_4cc2_af17_3085b1f4bf35.slice/crio-f0ebbb8c9b685199b3f038dc3c023cf2b1187986d8d6e642a16bd1601e5224cd WatchSource:0}: Error finding container f0ebbb8c9b685199b3f038dc3c023cf2b1187986d8d6e642a16bd1601e5224cd: Status 404 returned error can't find the container with id f0ebbb8c9b685199b3f038dc3c023cf2b1187986d8d6e642a16bd1601e5224cd Apr 20 13:32:21.013581 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.013552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod \"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:32:21.015869 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.015848 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 13:32:21.026725 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.026699 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 13:32:21.037135 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.037082 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cszcv\" (UniqueName: \"kubernetes.io/projected/e2b1c838-35ef-4d7c-898c-5604961fd9aa-kube-api-access-cszcv\") pod 
\"network-check-target-mfqvp\" (UID: \"e2b1c838-35ef-4d7c-898c-5604961fd9aa\") " pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:32:21.157389 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.157301 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wf844\"" Apr 20 13:32:21.165116 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.165086 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:32:21.294512 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.294482 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mfqvp"] Apr 20 13:32:21.297302 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:32:21.297277 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b1c838_35ef_4d7c_898c_5604961fd9aa.slice/crio-258a37f3a522945fc8ba971c5df1a8b4e8a7fb7e464cd81e4eaf42da919f3fa1 WatchSource:0}: Error finding container 258a37f3a522945fc8ba971c5df1a8b4e8a7fb7e464cd81e4eaf42da919f3fa1: Status 404 returned error can't find the container with id 258a37f3a522945fc8ba971c5df1a8b4e8a7fb7e464cd81e4eaf42da919f3fa1 Apr 20 13:32:21.505550 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.505504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954"} Apr 20 13:32:21.505550 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.505548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd"} Apr 20 13:32:21.506041 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.505561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b"} Apr 20 13:32:21.506041 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.505575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerStarted","Data":"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87"} Apr 20 13:32:21.507085 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.507048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-55n9j" event={"ID":"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35","Type":"ContainerStarted","Data":"f0ebbb8c9b685199b3f038dc3c023cf2b1187986d8d6e642a16bd1601e5224cd"} Apr 20 13:32:21.508105 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.508075 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mfqvp" event={"ID":"e2b1c838-35ef-4d7c-898c-5604961fd9aa","Type":"ContainerStarted","Data":"258a37f3a522945fc8ba971c5df1a8b4e8a7fb7e464cd81e4eaf42da919f3fa1"} Apr 20 13:32:21.540153 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:21.540096 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.676587451 podStartE2EDuration="5.540079051s" podCreationTimestamp="2026-04-20 13:32:16 +0000 UTC" firstStartedPulling="2026-04-20 13:32:17.471880376 +0000 UTC m=+63.001806734" lastFinishedPulling="2026-04-20 13:32:20.335371959 +0000 UTC m=+65.865298334" 
observedRunningTime="2026-04-20 13:32:21.538646293 +0000 UTC m=+67.068572675" watchObservedRunningTime="2026-04-20 13:32:21.540079051 +0000 UTC m=+67.070005436" Apr 20 13:32:22.514962 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:22.514916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-55n9j" event={"ID":"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35","Type":"ContainerStarted","Data":"3aaf5a9d9e7eef5cc9f1d8944de37a184c7523bec355cddc7e050980fd47d259"} Apr 20 13:32:22.514962 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:22.514966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-55n9j" event={"ID":"e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35","Type":"ContainerStarted","Data":"f6cfec7c6c078ed0e43907f910c6823b179d6b92ae37047cc4a55801a86917e7"} Apr 20 13:32:22.536463 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:22.535446 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-55n9j" podStartSLOduration=66.490810634 podStartE2EDuration="1m7.535427732s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:32:20.99210045 +0000 UTC m=+66.522026815" lastFinishedPulling="2026-04-20 13:32:22.036717551 +0000 UTC m=+67.566643913" observedRunningTime="2026-04-20 13:32:22.53394551 +0000 UTC m=+68.063871891" watchObservedRunningTime="2026-04-20 13:32:22.535427732 +0000 UTC m=+68.065354113" Apr 20 13:32:24.496964 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:24.496939 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-b6688569-k52vh" Apr 20 13:32:24.523985 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:24.523954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mfqvp" 
event={"ID":"e2b1c838-35ef-4d7c-898c-5604961fd9aa","Type":"ContainerStarted","Data":"60a2939b601ce297d845c214da4d119c4a5a5e0ba4ef5d6a4be3c7b6de7ae77d"} Apr 20 13:32:24.524130 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:24.524101 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mfqvp" Apr 20 13:32:24.541588 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:24.541537 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mfqvp" podStartSLOduration=66.694308095 podStartE2EDuration="1m9.541521921s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:32:21.29926687 +0000 UTC m=+66.829193229" lastFinishedPulling="2026-04-20 13:32:24.146480695 +0000 UTC m=+69.676407055" observedRunningTime="2026-04-20 13:32:24.540364937 +0000 UTC m=+70.070291318" watchObservedRunningTime="2026-04-20 13:32:24.541521921 +0000 UTC m=+70.071448303" Apr 20 13:32:26.459980 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:26.459930 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:32:34.510910 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:34.510879 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:34.510910 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:34.510916 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z" Apr 20 13:32:41.431435 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:41.431407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/init-config-reloader/0.log" Apr 20 13:32:41.438558 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:41.438537 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/alertmanager/0.log" Apr 20 13:32:41.584464 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:41.584437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/config-reloader/0.log" Apr 20 13:32:41.784694 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:41.784664 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/kube-rbac-proxy-web/0.log" Apr 20 13:32:41.984391 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:41.984318 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/kube-rbac-proxy/0.log" Apr 20 13:32:42.184477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:42.184410 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/kube-rbac-proxy-metric/0.log" Apr 20 13:32:42.384068 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:42.384043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_2abfc9be-e469-4476-8b64-7fa7acb3f5cf/prom-label-proxy/0.log" Apr 20 13:32:42.784608 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:42.784580 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8dhct_3a522c83-7471-4e5d-be7f-5175e61ac4cd/kube-state-metrics/0.log" Apr 20 13:32:42.983959 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:42.983934 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8dhct_3a522c83-7471-4e5d-be7f-5175e61ac4cd/kube-rbac-proxy-main/0.log" Apr 20 13:32:43.183727 ip-10-0-133-1 kubenswrapper[2573]: I0420 
13:32:43.183659 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8dhct_3a522c83-7471-4e5d-be7f-5175e61ac4cd/kube-rbac-proxy-self/0.log" Apr 20 13:32:43.385704 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:43.385679 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-58fd84c859-2cb5z_3fddfc18-1911-4f8a-bc01-f13e1fee38da/metrics-server/0.log" Apr 20 13:32:43.583830 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:43.583802 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cxm2d_c9b945b1-227f-44ca-b322-5d475bfba434/monitoring-plugin/0.log" Apr 20 13:32:44.383251 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:44.383225 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8hjt_0b5f3471-c214-4da6-8284-8a2e35239729/init-textfile/0.log" Apr 20 13:32:44.585179 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:44.585155 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8hjt_0b5f3471-c214-4da6-8284-8a2e35239729/node-exporter/0.log" Apr 20 13:32:44.783479 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:44.783437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8hjt_0b5f3471-c214-4da6-8284-8a2e35239729/kube-rbac-proxy/0.log" Apr 20 13:32:45.585991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:45.585957 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p2jvz_21e6288b-35f0-467f-85ec-0224e98f6ecf/kube-rbac-proxy-main/0.log" Apr 20 13:32:45.791057 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:45.791029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p2jvz_21e6288b-35f0-467f-85ec-0224e98f6ecf/kube-rbac-proxy-self/0.log" Apr 20 
13:32:45.984605 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:45.984579 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p2jvz_21e6288b-35f0-467f-85ec-0224e98f6ecf/openshift-state-metrics/0.log" Apr 20 13:32:46.183786 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:46.183739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/init-config-reloader/0.log" Apr 20 13:32:46.385204 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:46.385130 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/prometheus/0.log" Apr 20 13:32:46.584319 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:46.584290 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/config-reloader/0.log" Apr 20 13:32:46.785112 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:46.785082 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/thanos-sidecar/0.log" Apr 20 13:32:46.984192 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:46.984164 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/kube-rbac-proxy-web/0.log" Apr 20 13:32:47.191984 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:47.191911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/kube-rbac-proxy/0.log" Apr 20 13:32:47.388311 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:47.388285 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d19b1d29-4f03-425e-9146-821f748286fe/kube-rbac-proxy-thanos/0.log" Apr 20 13:32:47.585765 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:47.585724 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-96m7g_fec29c32-4927-4343-91dc-5b24cc32dd2a/prometheus-operator/0.log"
Apr 20 13:32:47.783885 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:47.783855 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-96m7g_fec29c32-4927-4343-91dc-5b24cc32dd2a/kube-rbac-proxy/0.log"
Apr 20 13:32:47.983704 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:47.983673 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-7x7xq_77018601-43ed-4d16-b80f-22d590fbb6ea/prometheus-operator-admission-webhook/0.log"
Apr 20 13:32:48.184123 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:48.184081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/thanos-query/0.log"
Apr 20 13:32:48.384735 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:48.384661 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy-web/0.log"
Apr 20 13:32:48.584925 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:48.584895 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy/0.log"
Apr 20 13:32:48.786262 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:48.786236 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/prom-label-proxy/0.log"
Apr 20 13:32:48.983933 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:48.983906 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy-rules/0.log"
Apr 20 13:32:49.188586 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:49.188511 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy-metrics/0.log"
Apr 20 13:32:54.516528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:54.516498 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z"
Apr 20 13:32:54.520460 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:54.520436 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-58fd84c859-2cb5z"
Apr 20 13:32:55.530298 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:32:55.530243 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mfqvp"
Apr 20 13:33:16.460139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:16.460096 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:16.479014 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:16.478988 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:16.694328 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:16.694299 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:30.301340 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301303 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 13:33:30.301842 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301737 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="alertmanager" containerID="cri-o://c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1" gracePeriod=120
Apr 20 13:33:30.301842 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301793 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-metric" containerID="cri-o://fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58" gracePeriod=120
Apr 20 13:33:30.301966 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301813 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-web" containerID="cri-o://9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181" gracePeriod=120
Apr 20 13:33:30.301966 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301840 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy" containerID="cri-o://bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa" gracePeriod=120
Apr 20 13:33:30.301966 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301864 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="prom-label-proxy" containerID="cri-o://259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2" gracePeriod=120
Apr 20 13:33:30.301966 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.301817 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="config-reloader" containerID="cri-o://2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f" gracePeriod=120
Apr 20 13:33:30.720824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720722 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2" exitCode=0
Apr 20 13:33:30.720824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720764 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa" exitCode=0
Apr 20 13:33:30.720824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720788 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f" exitCode=0
Apr 20 13:33:30.720824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720795 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1" exitCode=0
Apr 20 13:33:30.720824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"}
Apr 20 13:33:30.720824 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"}
Apr 20 13:33:30.721112 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"}
Apr 20 13:33:30.721112 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:30.720848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"}
Apr 20 13:33:31.535819 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.535795 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:33:31.611273 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611240 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-volume\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611294 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-trusted-ca-bundle\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611333 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-out\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611360 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-cluster-tls-config\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611389 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611455 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-main-db\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611699 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611479 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-metrics-client-ca\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611699 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611510 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-web-config\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611699 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611543 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-main-tls\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611699 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611576 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611699 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611602 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-tls-assets\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611699 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611651 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611953 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611700 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lzpl\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-kube-api-access-7lzpl\") pod \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\" (UID: \"2abfc9be-e469-4476-8b64-7fa7acb3f5cf\") "
Apr 20 13:33:31.611953 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.611898 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:33:31.612057 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.612040 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-main-db\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.612415 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.612345 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:33:31.613097 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.613055 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:33:31.615096 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.615058 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-out" (OuterVolumeSpecName: "config-out") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:33:31.615902 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.615866 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.616579 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.616528 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.616785 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.616736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-kube-api-access-7lzpl" (OuterVolumeSpecName: "kube-api-access-7lzpl") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "kube-api-access-7lzpl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:33:31.617097 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.617021 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:33:31.617323 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.617289 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.617527 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.617502 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.617821 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.617792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.621469 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.621443 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.629222 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.629199 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-web-config" (OuterVolumeSpecName: "web-config") pod "2abfc9be-e469-4476-8b64-7fa7acb3f5cf" (UID: "2abfc9be-e469-4476-8b64-7fa7acb3f5cf"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:33:31.712817 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712727 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lzpl\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-kube-api-access-7lzpl\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.712950 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712916 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-volume\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.712950 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712929 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.712950 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712940 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-config-out\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.712950 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712949 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-cluster-tls-config\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712959 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712969 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-metrics-client-ca\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712979 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-web-config\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712988 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-main-tls\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.712996 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.713005 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-tls-assets\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.713074 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.713013 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2abfc9be-e469-4476-8b64-7fa7acb3f5cf-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\""
Apr 20 13:33:31.726538 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726510 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58" exitCode=0
Apr 20 13:33:31.726538 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726533 2573 generic.go:358] "Generic (PLEG): container finished" podID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerID="9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181" exitCode=0
Apr 20 13:33:31.726723 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726555 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"}
Apr 20 13:33:31.726723 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726586 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"}
Apr 20 13:33:31.726723 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726597 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2abfc9be-e469-4476-8b64-7fa7acb3f5cf","Type":"ContainerDied","Data":"3d8ee99a86fb96b08b27d2f136fcb9fe2f294a599e3703f66ef040240c69448f"}
Apr 20 13:33:31.726723 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726618 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 13:33:31.726723 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.726622 2573 scope.go:117] "RemoveContainer" containerID="259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"
Apr 20 13:33:31.734274 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.734255 2573 scope.go:117] "RemoveContainer" containerID="fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"
Apr 20 13:33:31.740725 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.740708 2573 scope.go:117] "RemoveContainer" containerID="bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"
Apr 20 13:33:31.746857 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.746842 2573 scope.go:117] "RemoveContainer" containerID="9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"
Apr 20 13:33:31.752892 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.752863 2573 scope.go:117] "RemoveContainer" containerID="2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"
Apr 20 13:33:31.758483 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.758464 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 13:33:31.759647 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.759631 2573 scope.go:117] "RemoveContainer" containerID="c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"
Apr 20 13:33:31.764358 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.764337 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 13:33:31.766846 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.766799 2573 scope.go:117] "RemoveContainer" containerID="56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256"
Apr 20 13:33:31.773239 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773216 2573 scope.go:117] "RemoveContainer" containerID="259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"
Apr 20 13:33:31.773482 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.773463 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2\": container with ID starting with 259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2 not found: ID does not exist" containerID="259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"
Apr 20 13:33:31.773553 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773487 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"} err="failed to get container status \"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2\": rpc error: code = NotFound desc = could not find container \"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2\": container with ID starting with 259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2 not found: ID does not exist"
Apr 20 13:33:31.773553 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773523 2573 scope.go:117] "RemoveContainer" containerID="fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"
Apr 20 13:33:31.773695 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.773682 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58\": container with ID starting with fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58 not found: ID does not exist" containerID="fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"
Apr 20 13:33:31.773728 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773699 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"} err="failed to get container status \"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58\": rpc error: code = NotFound desc = could not find container \"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58\": container with ID starting with fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58 not found: ID does not exist"
Apr 20 13:33:31.773728 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773711 2573 scope.go:117] "RemoveContainer" containerID="bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"
Apr 20 13:33:31.773905 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.773891 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa\": container with ID starting with bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa not found: ID does not exist" containerID="bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"
Apr 20 13:33:31.773944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773908 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"} err="failed to get container status \"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa\": rpc error: code = NotFound desc = could not find container \"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa\": container with ID starting with bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa not found: ID does not exist"
Apr 20 13:33:31.773944 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.773924 2573 scope.go:117] "RemoveContainer" containerID="9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"
Apr 20 13:33:31.774118 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.774099 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181\": container with ID starting with 9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181 not found: ID does not exist" containerID="9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"
Apr 20 13:33:31.774153 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774126 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"} err="failed to get container status \"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181\": rpc error: code = NotFound desc = could not find container \"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181\": container with ID starting with 9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181 not found: ID does not exist"
Apr 20 13:33:31.774153 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774143 2573 scope.go:117] "RemoveContainer" containerID="2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"
Apr 20 13:33:31.774394 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.774378 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f\": container with ID starting with 2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f not found: ID does not exist" containerID="2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"
Apr 20 13:33:31.774430 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774408 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"} err="failed to get container status \"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f\": rpc error: code = NotFound desc = could not find container \"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f\": container with ID starting with 2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f not found: ID does not exist"
Apr 20 13:33:31.774430 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774422 2573 scope.go:117] "RemoveContainer" containerID="c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"
Apr 20 13:33:31.774629 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.774612 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1\": container with ID starting with c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1 not found: ID does not exist" containerID="c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"
Apr 20 13:33:31.774662 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774633 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"} err="failed to get container status \"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1\": rpc error: code = NotFound desc = could not find container \"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1\": container with ID starting with c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1 not found: ID does not exist"
Apr 20 13:33:31.774662 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774645 2573 scope.go:117] "RemoveContainer" containerID="56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256"
Apr 20 13:33:31.774855 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:31.774840 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256\": container with ID starting with 56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256 not found: ID does not exist" containerID="56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256"
Apr 20 13:33:31.774899 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774859 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256"} err="failed to get container status \"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256\": rpc error: code = NotFound desc = could not find container \"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256\": container with ID starting with 56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256 not found: ID does not exist"
Apr 20 13:33:31.774899 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.774873 2573 scope.go:117] "RemoveContainer" containerID="259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"
Apr 20 13:33:31.775086 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775070 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2"} err="failed to get container status \"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2\": rpc error: code = NotFound desc = could not find container \"259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2\": container with ID starting with 259e6d290c1dcbc3ae9cdbbe9efa29ec3a84e7486e86dc923db5e5e59aff27b2 not found: ID does not exist"
Apr 20 13:33:31.775122 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775088 2573 scope.go:117] "RemoveContainer" containerID="fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"
Apr 20 13:33:31.775307 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775287 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58"} err="failed to get container status \"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58\": rpc error: code = NotFound desc = could not find container \"fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58\": container with ID starting with fb64582fd294387193efb2e5b5c352098f482dcc8d82254f0a4504d4238eab58 not found: ID does not exist"
Apr 20 13:33:31.775353 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775308 2573 scope.go:117] "RemoveContainer" containerID="bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"
Apr 20 13:33:31.775530 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775508 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa"} err="failed to get container status \"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa\": rpc error: code = NotFound desc = could not find container \"bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa\": container with ID starting with bae4636f7b6510a3386495cc4ddc4935a9c3664f1ea7f4e0b3411e62af7adbfa not found: ID does not exist"
Apr 20 13:33:31.775530 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775529 2573 scope.go:117] "RemoveContainer" containerID="9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"
Apr 20 13:33:31.775740 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775718 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181"} err="failed to get container status \"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181\": rpc error: code = NotFound desc = could not find container \"9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181\": container with ID starting with 9921b67fd7f35ebe7cc882ef738061914998f4e613f8328e423e75e230eb2181 not found: ID does not exist"
Apr 20 13:33:31.775829 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775741 2573 scope.go:117] "RemoveContainer" containerID="2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"
Apr 20 13:33:31.775968 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775953 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f"} err="failed to get container status \"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f\": rpc error: code = NotFound desc = could not find container \"2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f\": container with ID starting with 2800d195f6aef511440338951480f6b5a8a87b534abff67b379889eec2c7e33f not found: ID does not exist"
Apr 20 13:33:31.775968 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.775967 2573 scope.go:117] "RemoveContainer" containerID="c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"
Apr 20 13:33:31.776164 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.776146 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1"} err="failed to get container status \"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1\": rpc error: code = NotFound desc = could not find container \"c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1\": container with ID starting with
c6ec96733899c0ca63f5841650e2a8cd553eb89bfa9a25f8f41f0e80a6851db1 not found: ID does not exist" Apr 20 13:33:31.776202 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.776165 2573 scope.go:117] "RemoveContainer" containerID="56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256" Apr 20 13:33:31.776347 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.776333 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256"} err="failed to get container status \"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256\": rpc error: code = NotFound desc = could not find container \"56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256\": container with ID starting with 56afb8eb3df72ee14fdebb32ae7951a8525d576057e7796adbb9ed8c01157256 not found: ID does not exist" Apr 20 13:33:31.820355 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820327 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 13:33:31.820662 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820649 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="prom-label-proxy" Apr 20 13:33:31.820703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820664 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="prom-label-proxy" Apr 20 13:33:31.820703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820676 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="init-config-reloader" Apr 20 13:33:31.820703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820681 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="init-config-reloader" Apr 20 
13:33:31.820703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820691 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="config-reloader" Apr 20 13:33:31.820703 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820700 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="config-reloader" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820710 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="alertmanager" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820716 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="alertmanager" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820723 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-metric" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820728 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-metric" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820737 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-web" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820742 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-web" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820764 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" 
containerName="kube-rbac-proxy" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820771 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820815 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820826 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-metric" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820832 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="prom-label-proxy" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820839 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="alertmanager" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820846 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="kube-rbac-proxy-web" Apr 20 13:33:31.820881 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.820857 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" containerName="config-reloader" Apr 20 13:33:31.825526 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.825511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.830093 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.830064 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 13:33:31.830093 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.830064 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 13:33:31.830226 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.830065 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 13:33:31.830800 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.830783 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 13:33:31.830907 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.830887 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xl7j9\"" Apr 20 13:33:31.830970 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.830910 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 13:33:31.831037 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.831018 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 13:33:31.831714 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.831701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 13:33:31.835856 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.835839 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 13:33:31.843136 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.843119 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 13:33:31.860879 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.860854 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 13:33:31.914788 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce3f9145-cf57-4f49-8343-3564aac75046-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.914916 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.914916 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-web-config\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.914916 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-config-volume\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.914916 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914899 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce3f9145-cf57-4f49-8343-3564aac75046-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.914994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 20 13:33:31.915056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.915011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btg8q\" (UniqueName: \"kubernetes.io/projected/ce3f9145-cf57-4f49-8343-3564aac75046-kube-api-access-btg8q\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.915030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915274 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.915068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce3f9145-cf57-4f49-8343-3564aac75046-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915274 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.915115 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce3f9145-cf57-4f49-8343-3564aac75046-config-out\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:31.915274 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:31.915178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/ce3f9145-cf57-4f49-8343-3564aac75046-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015571 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce3f9145-cf57-4f49-8343-3564aac75046-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015707 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce3f9145-cf57-4f49-8343-3564aac75046-config-out\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015707 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce3f9145-cf57-4f49-8343-3564aac75046-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015707 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce3f9145-cf57-4f49-8343-3564aac75046-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015886 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015886 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-web-config\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.015886 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-config-volume\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016026 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016026 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce3f9145-cf57-4f49-8343-3564aac75046-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016026 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.015977 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016026 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.016018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016229 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.016046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btg8q\" (UniqueName: \"kubernetes.io/projected/ce3f9145-cf57-4f49-8343-3564aac75046-kube-api-access-btg8q\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016229 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.016073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016229 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.016105 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce3f9145-cf57-4f49-8343-3564aac75046-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.016620 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.016591 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce3f9145-cf57-4f49-8343-3564aac75046-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.017253 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.017225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce3f9145-cf57-4f49-8343-3564aac75046-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.018651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.018623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce3f9145-cf57-4f49-8343-3564aac75046-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.018651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.018642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce3f9145-cf57-4f49-8343-3564aac75046-config-out\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.018822 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.018725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.018890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.018851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.018890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.018863 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.019117 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.019093 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-web-config\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.019417 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.019398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.019815 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.019798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-config-volume\") 
pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.020518 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.020503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce3f9145-cf57-4f49-8343-3564aac75046-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.029408 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.029390 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btg8q\" (UniqueName: \"kubernetes.io/projected/ce3f9145-cf57-4f49-8343-3564aac75046-kube-api-access-btg8q\") pod \"alertmanager-main-0\" (UID: \"ce3f9145-cf57-4f49-8343-3564aac75046\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.134573 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.134543 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 13:33:32.283323 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.283247 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 13:33:32.297879 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:33:32.297855 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3f9145_cf57_4f49_8343_3564aac75046.slice/crio-9574cf99c7165b332b50e662008c4fb9ea2604d9f54fa75003403cf6ec8377aa WatchSource:0}: Error finding container 9574cf99c7165b332b50e662008c4fb9ea2604d9f54fa75003403cf6ec8377aa: Status 404 returned error can't find the container with id 9574cf99c7165b332b50e662008c4fb9ea2604d9f54fa75003403cf6ec8377aa Apr 20 13:33:32.730905 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.730876 2573 generic.go:358] "Generic (PLEG): container finished" podID="ce3f9145-cf57-4f49-8343-3564aac75046" containerID="b7ced17b490454f39382fc1e33ccecf3d92038b1d6c4a6e2d42c88c334d65fa3" exitCode=0 Apr 20 13:33:32.731234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.730911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerDied","Data":"b7ced17b490454f39382fc1e33ccecf3d92038b1d6c4a6e2d42c88c334d65fa3"} Apr 20 13:33:32.731234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:32.730931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"9574cf99c7165b332b50e662008c4fb9ea2604d9f54fa75003403cf6ec8377aa"} Apr 20 13:33:33.134847 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.134815 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abfc9be-e469-4476-8b64-7fa7acb3f5cf" 
path="/var/lib/kubelet/pods/2abfc9be-e469-4476-8b64-7fa7acb3f5cf/volumes" Apr 20 13:33:33.737892 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.737854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"69b20d20c1b0bab31cc7cd9d388f2c93d09f61da17067837c6099e4c0a8b4403"} Apr 20 13:33:33.737892 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.737891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"33e88b37efaf35472ea7fee140749d13cb5018d866ec78e6e1f4f72af3b289d2"} Apr 20 13:33:33.738297 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.737903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"c1e990ed52c1cba6a627229b8999b7733835f9ff8bef22c68fc51bbd6c8a791a"} Apr 20 13:33:33.738297 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.737912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"3c0fb7f49538d23a09bd99c873ace18523e2f58a375a3cbaf70cfa26571ad179"} Apr 20 13:33:33.738297 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.737920 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"b09429ee8c8e09b5c9818a99360e066757c64524eb6516a55f26018078bdbece"} Apr 20 13:33:33.738297 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:33.737927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ce3f9145-cf57-4f49-8343-3564aac75046","Type":"ContainerStarted","Data":"686697e3e805777ed702bcbd5d37ba2d37470001d440b06494ef81e050f1ab4c"} Apr 20 13:33:34.416443 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.416390 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.416371736 podStartE2EDuration="3.416371736s" podCreationTimestamp="2026-04-20 13:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:33:33.80238148 +0000 UTC m=+139.332307859" watchObservedRunningTime="2026-04-20 13:33:34.416371736 +0000 UTC m=+139.946298107" Apr 20 13:33:34.416817 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.416801 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6d64487d4b-lmdwc"] Apr 20 13:33:34.420367 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.420354 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.446170 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.446144 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 13:33:34.446170 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.446159 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 13:33:34.446331 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.446252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 13:33:34.446459 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.446445 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 13:33:34.446527 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.446512 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-5kfzk\"" Apr 20 13:33:34.446983 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.446967 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 13:33:34.453015 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.452995 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 13:33:34.453131 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.453113 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d64487d4b-lmdwc"] Apr 20 13:33:34.536170 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-serving-certs-ca-bundle\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536170 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-federate-client-tls\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536400 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536400 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-secret-telemeter-client\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536400 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536275 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-telemeter-client-tls\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536400 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-metrics-client-ca\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536400 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvlp\" (UniqueName: \"kubernetes.io/projected/eece39e6-7f69-43d9-9ef3-312fa1419532-kube-api-access-wkvlp\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.536400 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.536373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637030 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.636997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvlp\" (UniqueName: \"kubernetes.io/projected/eece39e6-7f69-43d9-9ef3-312fa1419532-kube-api-access-wkvlp\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") 
" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637030 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-serving-certs-ca-bundle\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637092 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-federate-client-tls\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637168 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-secret-telemeter-client\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637234 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-telemeter-client-tls\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.637483 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.637240 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-metrics-client-ca\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.638048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.638015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-metrics-client-ca\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.638217 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.638194 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: 
\"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.638307 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.638199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eece39e6-7f69-43d9-9ef3-312fa1419532-serving-certs-ca-bundle\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.640202 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.640179 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.640382 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.640363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-secret-telemeter-client\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.640458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.640375 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-telemeter-client-tls\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.640458 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.640379 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/eece39e6-7f69-43d9-9ef3-312fa1419532-federate-client-tls\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.649383 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.649356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvlp\" (UniqueName: \"kubernetes.io/projected/eece39e6-7f69-43d9-9ef3-312fa1419532-kube-api-access-wkvlp\") pod \"telemeter-client-6d64487d4b-lmdwc\" (UID: \"eece39e6-7f69-43d9-9ef3-312fa1419532\") " pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.727855 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.727767 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 13:33:34.728247 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728222 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="prometheus" containerID="cri-o://c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" gracePeriod=600 Apr 20 13:33:34.728325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728269 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="config-reloader" containerID="cri-o://8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" gracePeriod=600 Apr 20 13:33:34.728325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728260 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy" 
containerID="cri-o://369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" gracePeriod=600 Apr 20 13:33:34.728325 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728307 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-web" containerID="cri-o://7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" gracePeriod=600 Apr 20 13:33:34.728466 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728278 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="thanos-sidecar" containerID="cri-o://f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" gracePeriod=600 Apr 20 13:33:34.728466 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728401 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-thanos" containerID="cri-o://bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" gracePeriod=600 Apr 20 13:33:34.728576 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.728561 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" Apr 20 13:33:34.896145 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.896111 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d64487d4b-lmdwc"] Apr 20 13:33:34.899536 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:33:34.899493 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeece39e6_7f69_43d9_9ef3_312fa1419532.slice/crio-8b9318bbb33830bd37027f11e09a4d7531dd526ae58369c44a11eb1ac73965e8 WatchSource:0}: Error finding container 8b9318bbb33830bd37027f11e09a4d7531dd526ae58369c44a11eb1ac73965e8: Status 404 returned error can't find the container with id 8b9318bbb33830bd37027f11e09a4d7531dd526ae58369c44a11eb1ac73965e8 Apr 20 13:33:34.981265 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:34.981248 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.144287 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144260 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-trusted-ca-bundle\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144439 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144296 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-metrics-client-certs\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144439 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144332 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-grpc-tls\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144439 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144358 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-config-out\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144439 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144385 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144439 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144415 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-rulefiles-0\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144678 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144449 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-metrics-client-ca\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144678 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144577 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144678 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144629 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-web-config\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144678 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144660 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-tls\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144687 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-config\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144724 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-kube-rbac-proxy\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144774 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: 
"prometheus-trusted-ca-bundle") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:33:35.144890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144787 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-serving-certs-ca-bundle\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.144890 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144846 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r6pj\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-kube-api-access-8r6pj\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.145134 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.144897 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-db\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.145134 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145050 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-tls-assets\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.145134 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145078 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-thanos-prometheus-http-client-file\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.145460 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145431 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:33:35.145555 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145487 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-kubelet-serving-ca-bundle\") pod \"d19b1d29-4f03-425e-9146-821f748286fe\" (UID: \"d19b1d29-4f03-425e-9146-821f748286fe\") " Apr 20 13:33:35.145785 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145768 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.145868 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145792 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-metrics-client-ca\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.145928 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.145886 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:33:35.146042 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.146020 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:33:35.146247 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.146225 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:33:35.147645 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.147613 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:33:35.147955 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.147929 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.148323 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.148369 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.148383 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.150900 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.150967 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.150982 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-config-out" (OuterVolumeSpecName: "config-out") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 13:33:35.151000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.150996 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:33:35.151394 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.151036 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.151394 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.151291 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-config" (OuterVolumeSpecName: "config") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.152148 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.152123 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-kube-api-access-8r6pj" (OuterVolumeSpecName: "kube-api-access-8r6pj") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "kube-api-access-8r6pj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:33:35.162256 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.162232 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-web-config" (OuterVolumeSpecName: "web-config") pod "d19b1d29-4f03-425e-9146-821f748286fe" (UID: "d19b1d29-4f03-425e-9146-821f748286fe"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.246982 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-tls-assets\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247007 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247017 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247026 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-metrics-client-certs\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247036 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-grpc-tls\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247045 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-config-out\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 
13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247054 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247055 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247063 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247072 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247082 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-web-config\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247093 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247101 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-config\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:33:35.247109 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d19b1d29-4f03-425e-9146-821f748286fe-secret-kube-rbac-proxy\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247118 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19b1d29-4f03-425e-9146-821f748286fe-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247126 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8r6pj\" (UniqueName: \"kubernetes.io/projected/d19b1d29-4f03-425e-9146-821f748286fe-kube-api-access-8r6pj\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.247372 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.247134 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d19b1d29-4f03-425e-9146-821f748286fe-prometheus-k8s-db\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:33:35.748680 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.748644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" event={"ID":"eece39e6-7f69-43d9-9ef3-312fa1419532","Type":"ContainerStarted","Data":"8b9318bbb33830bd37027f11e09a4d7531dd526ae58369c44a11eb1ac73965e8"} Apr 20 13:33:35.751183 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751155 2573 generic.go:358] "Generic (PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" exitCode=0 Apr 20 13:33:35.751183 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751182 2573 generic.go:358] "Generic 
(PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" exitCode=0 Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751191 2573 generic.go:358] "Generic (PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" exitCode=0 Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751197 2573 generic.go:358] "Generic (PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" exitCode=0 Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751203 2573 generic.go:358] "Generic (PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" exitCode=0 Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751208 2573 generic.go:358] "Generic (PLEG): container finished" podID="d19b1d29-4f03-425e-9146-821f748286fe" containerID="c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" exitCode=0 Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954"} Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd"} Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751291 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b"} Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751291 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87"} Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25"} Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751321 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c"} Apr 20 13:33:35.751326 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d19b1d29-4f03-425e-9146-821f748286fe","Type":"ContainerDied","Data":"cb697fae9e52a1d70219ea6853d049696c7b828e8be1c52faa533a44423e370b"} Apr 20 13:33:35.751720 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.751345 2573 scope.go:117] "RemoveContainer" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" Apr 20 13:33:35.761739 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:33:35.761718 2573 scope.go:117] "RemoveContainer" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" Apr 20 13:33:35.769034 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.769019 2573 scope.go:117] "RemoveContainer" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" Apr 20 13:33:35.775043 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.775026 2573 scope.go:117] "RemoveContainer" containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" Apr 20 13:33:35.781248 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.781234 2573 scope.go:117] "RemoveContainer" containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" Apr 20 13:33:35.783133 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.783080 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 13:33:35.787853 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.787839 2573 scope.go:117] "RemoveContainer" containerID="c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" Apr 20 13:33:35.794244 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.794223 2573 scope.go:117] "RemoveContainer" containerID="cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98" Apr 20 13:33:35.797799 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.797781 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 13:33:35.800969 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.800954 2573 scope.go:117] "RemoveContainer" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" Apr 20 13:33:35.801207 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.801189 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": container with ID starting 
with bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954 not found: ID does not exist" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" Apr 20 13:33:35.801263 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801216 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954"} err="failed to get container status \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": rpc error: code = NotFound desc = could not find container \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": container with ID starting with bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954 not found: ID does not exist" Apr 20 13:33:35.801263 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801239 2573 scope.go:117] "RemoveContainer" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" Apr 20 13:33:35.801457 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.801441 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": container with ID starting with 369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd not found: ID does not exist" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" Apr 20 13:33:35.801495 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801462 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd"} err="failed to get container status \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": rpc error: code = NotFound desc = could not find container \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": container with ID starting with 
369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd not found: ID does not exist" Apr 20 13:33:35.801495 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801477 2573 scope.go:117] "RemoveContainer" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" Apr 20 13:33:35.801671 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.801652 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": container with ID starting with 7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b not found: ID does not exist" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" Apr 20 13:33:35.801734 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801680 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b"} err="failed to get container status \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": rpc error: code = NotFound desc = could not find container \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": container with ID starting with 7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b not found: ID does not exist" Apr 20 13:33:35.801734 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801703 2573 scope.go:117] "RemoveContainer" containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" Apr 20 13:33:35.801966 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.801950 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": container with ID starting with f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87 not found: ID does not exist" 
containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" Apr 20 13:33:35.802004 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801969 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87"} err="failed to get container status \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": rpc error: code = NotFound desc = could not find container \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": container with ID starting with f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87 not found: ID does not exist" Apr 20 13:33:35.802004 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.801982 2573 scope.go:117] "RemoveContainer" containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" Apr 20 13:33:35.802208 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.802192 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": container with ID starting with 8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25 not found: ID does not exist" containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" Apr 20 13:33:35.802263 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802211 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25"} err="failed to get container status \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": rpc error: code = NotFound desc = could not find container \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": container with ID starting with 8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25 not found: ID does not exist" Apr 20 13:33:35.802263 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802224 2573 scope.go:117] "RemoveContainer" containerID="c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" Apr 20 13:33:35.802422 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.802404 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": container with ID starting with c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c not found: ID does not exist" containerID="c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" Apr 20 13:33:35.802497 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802430 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c"} err="failed to get container status \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": rpc error: code = NotFound desc = could not find container \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": container with ID starting with c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c not found: ID does not exist" Apr 20 13:33:35.802497 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802449 2573 scope.go:117] "RemoveContainer" containerID="cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98" Apr 20 13:33:35.802635 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:33:35.802621 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": container with ID starting with cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98 not found: ID does not exist" containerID="cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98" Apr 20 13:33:35.802673 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:33:35.802638 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98"} err="failed to get container status \"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": rpc error: code = NotFound desc = could not find container \"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": container with ID starting with cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98 not found: ID does not exist" Apr 20 13:33:35.802673 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802650 2573 scope.go:117] "RemoveContainer" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" Apr 20 13:33:35.802923 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802909 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954"} err="failed to get container status \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": rpc error: code = NotFound desc = could not find container \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": container with ID starting with bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954 not found: ID does not exist" Apr 20 13:33:35.802923 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.802923 2573 scope.go:117] "RemoveContainer" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" Apr 20 13:33:35.803094 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803080 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd"} err="failed to get container status \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": rpc error: code = NotFound desc = could not find container 
\"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": container with ID starting with 369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd not found: ID does not exist" Apr 20 13:33:35.803139 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803093 2573 scope.go:117] "RemoveContainer" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" Apr 20 13:33:35.803256 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803240 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b"} err="failed to get container status \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": rpc error: code = NotFound desc = could not find container \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": container with ID starting with 7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b not found: ID does not exist" Apr 20 13:33:35.803297 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803256 2573 scope.go:117] "RemoveContainer" containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" Apr 20 13:33:35.803439 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803421 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87"} err="failed to get container status \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": rpc error: code = NotFound desc = could not find container \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": container with ID starting with f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87 not found: ID does not exist" Apr 20 13:33:35.803509 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803441 2573 scope.go:117] "RemoveContainer" 
containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" Apr 20 13:33:35.803631 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803614 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25"} err="failed to get container status \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": rpc error: code = NotFound desc = could not find container \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": container with ID starting with 8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25 not found: ID does not exist" Apr 20 13:33:35.803631 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803629 2573 scope.go:117] "RemoveContainer" containerID="c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" Apr 20 13:33:35.803802 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803783 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c"} err="failed to get container status \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": rpc error: code = NotFound desc = could not find container \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": container with ID starting with c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c not found: ID does not exist" Apr 20 13:33:35.803861 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.803805 2573 scope.go:117] "RemoveContainer" containerID="cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98" Apr 20 13:33:35.804018 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804001 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98"} err="failed to get container status 
\"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": rpc error: code = NotFound desc = could not find container \"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": container with ID starting with cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98 not found: ID does not exist" Apr 20 13:33:35.804109 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804020 2573 scope.go:117] "RemoveContainer" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" Apr 20 13:33:35.804242 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804220 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954"} err="failed to get container status \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": rpc error: code = NotFound desc = could not find container \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": container with ID starting with bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954 not found: ID does not exist" Apr 20 13:33:35.804288 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804242 2573 scope.go:117] "RemoveContainer" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" Apr 20 13:33:35.804412 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804398 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd"} err="failed to get container status \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": rpc error: code = NotFound desc = could not find container \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": container with ID starting with 369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd not found: ID does not exist" Apr 20 13:33:35.804460 ip-10-0-133-1 kubenswrapper[2573]: 
I0420 13:33:35.804412 2573 scope.go:117] "RemoveContainer" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" Apr 20 13:33:35.804582 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804568 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b"} err="failed to get container status \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": rpc error: code = NotFound desc = could not find container \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": container with ID starting with 7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b not found: ID does not exist" Apr 20 13:33:35.804634 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804584 2573 scope.go:117] "RemoveContainer" containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" Apr 20 13:33:35.804759 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804726 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87"} err="failed to get container status \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": rpc error: code = NotFound desc = could not find container \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": container with ID starting with f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87 not found: ID does not exist" Apr 20 13:33:35.804832 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804760 2573 scope.go:117] "RemoveContainer" containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" Apr 20 13:33:35.804928 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804911 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25"} 
err="failed to get container status \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": rpc error: code = NotFound desc = could not find container \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": container with ID starting with 8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25 not found: ID does not exist" Apr 20 13:33:35.804990 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.804929 2573 scope.go:117] "RemoveContainer" containerID="c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c" Apr 20 13:33:35.805080 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805060 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c"} err="failed to get container status \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": rpc error: code = NotFound desc = could not find container \"c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c\": container with ID starting with c85f76d3fb956efa37c8241e3e1413769938d1e37e8e16f5b8a26cb7dc49a12c not found: ID does not exist" Apr 20 13:33:35.805133 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805080 2573 scope.go:117] "RemoveContainer" containerID="cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98" Apr 20 13:33:35.805273 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805258 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98"} err="failed to get container status \"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": rpc error: code = NotFound desc = could not find container \"cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98\": container with ID starting with cfe9103f6381a2d3d15bdcf0233ec32e46c0611416499579398e5c8a792a8e98 not found: ID does not exist" Apr 20 13:33:35.805336 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805275 2573 scope.go:117] "RemoveContainer" containerID="bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954" Apr 20 13:33:35.805453 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805439 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954"} err="failed to get container status \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": rpc error: code = NotFound desc = could not find container \"bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954\": container with ID starting with bb15bd62dccd5639738dbb3e8dc2f57f1ab13bf94d32c1d8ba1e1b174d88c954 not found: ID does not exist" Apr 20 13:33:35.805500 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805453 2573 scope.go:117] "RemoveContainer" containerID="369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd" Apr 20 13:33:35.805605 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805587 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd"} err="failed to get container status \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": rpc error: code = NotFound desc = could not find container \"369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd\": container with ID starting with 369ec71a32d74d7f31f73b0959fcf609e4eea511c5fe6c0aa825d286bca47dfd not found: ID does not exist" Apr 20 13:33:35.805665 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805606 2573 scope.go:117] "RemoveContainer" containerID="7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b" Apr 20 13:33:35.805808 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805791 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b"} err="failed to get container status \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": rpc error: code = NotFound desc = could not find container \"7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b\": container with ID starting with 7b501e895bb53dbab5a799f4a558679ed2eb7f0868ad466b6ad7f906945c670b not found: ID does not exist" Apr 20 13:33:35.805867 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805809 2573 scope.go:117] "RemoveContainer" containerID="f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87" Apr 20 13:33:35.805994 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805978 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87"} err="failed to get container status \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": rpc error: code = NotFound desc = could not find container \"f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87\": container with ID starting with f3b5572b06d77ffbe9004b6eb004e1e4f1a71d60ca7e02a3082c2a74b301ec87 not found: ID does not exist" Apr 20 13:33:35.806045 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.805996 2573 scope.go:117] "RemoveContainer" containerID="8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25" Apr 20 13:33:35.806177 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.806159 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25"} err="failed to get container status \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": rpc error: code = NotFound desc = could not find container \"8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25\": container with ID starting with 
8eccaa9ef3b5e1da546bba41d2765b0098907a7b3cdbda0e53ec54edfb6dbb25 not found: ID does not exist"
Apr 20 13:33:35.850640 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.850608 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 13:33:35.851101 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851085 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-thanos"
Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851105 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-thanos"
Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851123 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="thanos-sidecar" Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851131 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="thanos-sidecar" Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851148 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="config-reloader" Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851157 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="config-reloader" Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851169 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="init-config-reloader" Apr 20 13:33:35.851180 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851178 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="init-config-reloader" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851194 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="prometheus" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851203 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="prometheus" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851211 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-web" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851219 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-web" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851229 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851237 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851298 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-web" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851312 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy-thanos" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851324 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="config-reloader" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851334 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="thanos-sidecar" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851345 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="kube-rbac-proxy" Apr 20 13:33:35.851438 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.851355 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d19b1d29-4f03-425e-9146-821f748286fe" containerName="prometheus" Apr 20 13:33:35.857089 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.857064 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.860879 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.860853 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 13:33:35.861103 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.860930 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-h9tlz\"" Apr 20 13:33:35.861225 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.861208 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 13:33:35.861225 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.861217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 13:33:35.861368 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.861323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 13:33:35.862057 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.862035 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 13:33:35.866277 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.866256 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 13:33:35.866394 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.866296 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 13:33:35.867166 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.867118 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 13:33:35.867166 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.867146 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-fobai5o6bovg8\"" Apr 20 13:33:35.867568 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.867428 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 13:33:35.867568 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.867470 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 13:33:35.868735 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.868716 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 13:33:35.869980 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.869958 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 13:33:35.883569 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.883542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 13:33:35.953650 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.953650 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-web-config\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.953650 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953629 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvdv\" (UniqueName: \"kubernetes.io/projected/426ca5ef-3a9a-4267-98bc-b9112b05e56f-kube-api-access-jwvdv\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/426ca5ef-3a9a-4267-98bc-b9112b05e56f-config-out\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953952 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.953983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.954003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.954026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/426ca5ef-3a9a-4267-98bc-b9112b05e56f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954048 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.954050 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954431 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.954069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:35.954431 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:35.954091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-config\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " 
pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-config\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-web-config\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.055972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvdv\" (UniqueName: \"kubernetes.io/projected/426ca5ef-3a9a-4267-98bc-b9112b05e56f-kube-api-access-jwvdv\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/426ca5ef-3a9a-4267-98bc-b9112b05e56f-config-out\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.060712 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.061673 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.056226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/426ca5ef-3a9a-4267-98bc-b9112b05e56f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.061673 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.059028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/426ca5ef-3a9a-4267-98bc-b9112b05e56f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.061673 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.061346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.061673 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.061471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.061673 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.061471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.062981 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.062671 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.063105 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.063036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-config\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.063402 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.063374 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.063719 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.063687 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.063847 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.063740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.064076 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.064057 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.064179 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.064125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.064638 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.064614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.065447 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.065403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/426ca5ef-3a9a-4267-98bc-b9112b05e56f-config-out\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.065923 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.065900 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.066029 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.066011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-web-config\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.066434 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.066410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/426ca5ef-3a9a-4267-98bc-b9112b05e56f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.066854 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.066818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/426ca5ef-3a9a-4267-98bc-b9112b05e56f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.078419 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.078401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvdv\" (UniqueName: \"kubernetes.io/projected/426ca5ef-3a9a-4267-98bc-b9112b05e56f-kube-api-access-jwvdv\") pod \"prometheus-k8s-0\" (UID: \"426ca5ef-3a9a-4267-98bc-b9112b05e56f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.168917 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.168876 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:33:36.316272 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.316242 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 13:33:36.319421 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:33:36.319396 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod426ca5ef_3a9a_4267_98bc_b9112b05e56f.slice/crio-51bd43b8f0554b3fb77c012cf0d7fd203b87e5da299758f73159950e24c5c156 WatchSource:0}: Error finding container 51bd43b8f0554b3fb77c012cf0d7fd203b87e5da299758f73159950e24c5c156: Status 404 returned error can't find the container with id 51bd43b8f0554b3fb77c012cf0d7fd203b87e5da299758f73159950e24c5c156
Apr 20 13:33:36.756056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.756018 2573 generic.go:358] "Generic (PLEG): container finished" podID="426ca5ef-3a9a-4267-98bc-b9112b05e56f" containerID="e19dba077e549c719aced29aa0d281c6e479e664dc0ed0b8170e2fccf7bc0916" exitCode=0
Apr 20 13:33:36.756245 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.756119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerDied","Data":"e19dba077e549c719aced29aa0d281c6e479e664dc0ed0b8170e2fccf7bc0916"}
Apr 20 13:33:36.756245 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:36.756160 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"51bd43b8f0554b3fb77c012cf0d7fd203b87e5da299758f73159950e24c5c156"}
Apr 20 13:33:37.135206 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.135183 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19b1d29-4f03-425e-9146-821f748286fe" path="/var/lib/kubelet/pods/d19b1d29-4f03-425e-9146-821f748286fe/volumes"
Apr 20 13:33:37.763068 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.763034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"59bf96eddcca39451d3852be94c638ded4259b8ee211009493ff9e2b1847be56"}
Apr 20 13:33:37.763068 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.763071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"90e4d7726bcd610a34a9b8c5bcfcf84489771ce552dcdd8aa89ec7ae01d40da3"}
Apr 20 13:33:37.763301 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.763081 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"69ac07c89e6b97c928005891c13042020e003c3b6ebbf02a842a00d3c042f2b2"}
Apr 20 13:33:37.763301 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.763092 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"8b52354adc21ae7c4e4db3029fdcff39be0fbfc368c5ceabb8039d8a3a7f40a8"}
Apr 20 13:33:37.763301 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.763104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"8ad54f118016e881325cbfa4b4b5aef55dd746bed448d14742c87d97a4648877"}
Apr 20 13:33:37.763301 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.763116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"426ca5ef-3a9a-4267-98bc-b9112b05e56f","Type":"ContainerStarted","Data":"6d84bebff518bc927addb72db4ad0213880d6cbdba6e4b834fdb87edaca51f0b"}
Apr 20 13:33:37.764837 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.764812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" event={"ID":"eece39e6-7f69-43d9-9ef3-312fa1419532","Type":"ContainerStarted","Data":"fbe803c2349246b7a4c44c88d56a1533d94c33e2b022dd1deb73a2b40ece8537"}
Apr 20 13:33:37.764837 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.764839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" event={"ID":"eece39e6-7f69-43d9-9ef3-312fa1419532","Type":"ContainerStarted","Data":"05e1f0b6bb87dc44aca5ebf8c4621fb4ad713c4e4730b92f273de0a8c479b4a7"}
Apr 20 13:33:37.765028 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.764850 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" event={"ID":"eece39e6-7f69-43d9-9ef3-312fa1419532","Type":"ContainerStarted","Data":"060b343ab9222a85478d29e913b26031a2b42827c12e6fb9462f536315fd41f4"}
Apr 20 13:33:37.810644 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.810552 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.810533204 podStartE2EDuration="2.810533204s" podCreationTimestamp="2026-04-20 13:33:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:33:37.809107251 +0000 UTC m=+143.339033626" watchObservedRunningTime="2026-04-20 13:33:37.810533204 +0000 UTC m=+143.340459584"
Apr 20 13:33:37.839166 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:37.839119 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6d64487d4b-lmdwc" podStartSLOduration=1.743599021 podStartE2EDuration="3.839105053s" podCreationTimestamp="2026-04-20 13:33:34 +0000 UTC" firstStartedPulling="2026-04-20 13:33:34.901564426 +0000 UTC m=+140.431490789" lastFinishedPulling="2026-04-20 13:33:36.997070463 +0000 UTC m=+142.526996821" observedRunningTime="2026-04-20 13:33:37.836673856 +0000 UTC m=+143.366600236" watchObservedRunningTime="2026-04-20 13:33:37.839105053 +0000 UTC m=+143.369031433"
Apr 20 13:33:41.169436 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:33:41.169400 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:34:36.169806 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:34:36.169767 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:34:36.185849 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:34:36.185825 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:34:36.956289 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:34:36.956264 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 13:35:39.378739 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.378664 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gq6hv"]
Apr 20 13:35:39.380950 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.380935 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.383339 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.383319 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 13:35:39.389206 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.389183 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gq6hv"]
Apr 20 13:35:39.476369 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.476315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e67a4318-4409-4e7d-9c39-57001252f5e2-kubelet-config\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.476546 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.476397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e67a4318-4409-4e7d-9c39-57001252f5e2-dbus\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.476546 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.476467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e67a4318-4409-4e7d-9c39-57001252f5e2-original-pull-secret\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.577587 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.577557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e67a4318-4409-4e7d-9c39-57001252f5e2-dbus\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.577789 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.577603 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e67a4318-4409-4e7d-9c39-57001252f5e2-original-pull-secret\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.577789 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.577664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e67a4318-4409-4e7d-9c39-57001252f5e2-kubelet-config\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.577789 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.577737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e67a4318-4409-4e7d-9c39-57001252f5e2-dbus\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.577789 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.577786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e67a4318-4409-4e7d-9c39-57001252f5e2-kubelet-config\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.579868 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.579848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e67a4318-4409-4e7d-9c39-57001252f5e2-original-pull-secret\") pod \"global-pull-secret-syncer-gq6hv\" (UID: \"e67a4318-4409-4e7d-9c39-57001252f5e2\") " pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.689664 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.689593 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gq6hv"
Apr 20 13:35:39.805794 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:39.805771 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gq6hv"]
Apr 20 13:35:39.808048 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:35:39.808021 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode67a4318_4409_4e7d_9c39_57001252f5e2.slice/crio-d68e96b0a30fc6c7f63ca88f7037765d5d871070211e9fcafa5b158599c0708e WatchSource:0}: Error finding container d68e96b0a30fc6c7f63ca88f7037765d5d871070211e9fcafa5b158599c0708e: Status 404 returned error can't find the container with id d68e96b0a30fc6c7f63ca88f7037765d5d871070211e9fcafa5b158599c0708e
Apr 20 13:35:40.120415 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:40.120380 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gq6hv" event={"ID":"e67a4318-4409-4e7d-9c39-57001252f5e2","Type":"ContainerStarted","Data":"d68e96b0a30fc6c7f63ca88f7037765d5d871070211e9fcafa5b158599c0708e"}
Apr 20 13:35:44.133445 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:44.133408 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gq6hv" event={"ID":"e67a4318-4409-4e7d-9c39-57001252f5e2","Type":"ContainerStarted","Data":"cf8d14f6f74b8a011f033e8d49f84100fbe2a87481003268a02426856028f115"}
Apr 20 13:35:44.151805 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:35:44.151738 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gq6hv" podStartSLOduration=1.48434993 podStartE2EDuration="5.151723404s" podCreationTimestamp="2026-04-20 13:35:39 +0000 UTC" firstStartedPulling="2026-04-20 13:35:39.809687037 +0000 UTC m=+265.339613394" lastFinishedPulling="2026-04-20 13:35:43.477060506 +0000 UTC m=+269.006986868" observedRunningTime="2026-04-20 13:35:44.150198292 +0000 UTC m=+269.680124673" watchObservedRunningTime="2026-04-20 13:35:44.151723404 +0000 UTC m=+269.681649784"
Apr 20 13:36:15.022690 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:15.022660 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log"
Apr 20 13:36:15.023247 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:15.022846 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log"
Apr 20 13:36:40.626786 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.626736 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pfkfw"]
Apr 20 13:36:40.628958 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.628944 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.631528 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.631508 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 13:36:40.631629 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.631608 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 13:36:40.632397 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.632380 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bpn2d\""
Apr 20 13:36:40.637856 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.637836 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pfkfw"]
Apr 20 13:36:40.688974 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.688947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/207b428a-bb8f-4172-b9fb-e7fa509c1561-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pfkfw\" (UID: \"207b428a-bb8f-4172-b9fb-e7fa509c1561\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.689080 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.688985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctd5\" (UniqueName: \"kubernetes.io/projected/207b428a-bb8f-4172-b9fb-e7fa509c1561-kube-api-access-mctd5\") pod \"cert-manager-webhook-597b96b99b-pfkfw\" (UID: \"207b428a-bb8f-4172-b9fb-e7fa509c1561\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.789704 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.789670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/207b428a-bb8f-4172-b9fb-e7fa509c1561-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pfkfw\" (UID: \"207b428a-bb8f-4172-b9fb-e7fa509c1561\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.789894 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.789742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mctd5\" (UniqueName: \"kubernetes.io/projected/207b428a-bb8f-4172-b9fb-e7fa509c1561-kube-api-access-mctd5\") pod \"cert-manager-webhook-597b96b99b-pfkfw\" (UID: \"207b428a-bb8f-4172-b9fb-e7fa509c1561\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.799729 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.799700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/207b428a-bb8f-4172-b9fb-e7fa509c1561-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-pfkfw\" (UID: \"207b428a-bb8f-4172-b9fb-e7fa509c1561\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.799895 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.799878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mctd5\" (UniqueName: \"kubernetes.io/projected/207b428a-bb8f-4172-b9fb-e7fa509c1561-kube-api-access-mctd5\") pod \"cert-manager-webhook-597b96b99b-pfkfw\" (UID: \"207b428a-bb8f-4172-b9fb-e7fa509c1561\") " pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:40.951175 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:40.951085 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:41.071655 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:41.071629 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-pfkfw"]
Apr 20 13:36:41.074030 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:36:41.073998 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod207b428a_bb8f_4172_b9fb_e7fa509c1561.slice/crio-c7dc6c7f6f665b3c1d3d6bdd470c4259ea8fc3c65e1d09f6c2861998ccaab1c7 WatchSource:0}: Error finding container c7dc6c7f6f665b3c1d3d6bdd470c4259ea8fc3c65e1d09f6c2861998ccaab1c7: Status 404 returned error can't find the container with id c7dc6c7f6f665b3c1d3d6bdd470c4259ea8fc3c65e1d09f6c2861998ccaab1c7
Apr 20 13:36:41.075863 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:41.075847 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 13:36:41.301028 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:41.300996 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw" event={"ID":"207b428a-bb8f-4172-b9fb-e7fa509c1561","Type":"ContainerStarted","Data":"c7dc6c7f6f665b3c1d3d6bdd470c4259ea8fc3c65e1d09f6c2861998ccaab1c7"}
Apr 20 13:36:45.317955 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:45.317921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw" event={"ID":"207b428a-bb8f-4172-b9fb-e7fa509c1561","Type":"ContainerStarted","Data":"6d370474fd415a536f4a9b56ee84c8bdc66d13f2f045964b3f0b32eb679b8372"}
Apr 20 13:36:45.318354 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:45.317999 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:45.350237 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:45.350186 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw" podStartSLOduration=2.03092769 podStartE2EDuration="5.350171993s" podCreationTimestamp="2026-04-20 13:36:40 +0000 UTC" firstStartedPulling="2026-04-20 13:36:41.07597269 +0000 UTC m=+326.605899052" lastFinishedPulling="2026-04-20 13:36:44.395216997 +0000 UTC m=+329.925143355" observedRunningTime="2026-04-20 13:36:45.347615241 +0000 UTC m=+330.877541623" watchObservedRunningTime="2026-04-20 13:36:45.350171993 +0000 UTC m=+330.880098372"
Apr 20 13:36:51.322651 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:51.322620 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-pfkfw"
Apr 20 13:36:59.055394 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.055309 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-p7gbp"]
Apr 20 13:36:59.058565 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.058541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-p7gbp"
Apr 20 13:36:59.061349 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.061326 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-z8x48\""
Apr 20 13:36:59.067898 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.067872 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-p7gbp"]
Apr 20 13:36:59.147796 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.147768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5qw7\" (UniqueName: \"kubernetes.io/projected/374be54b-d905-43fd-b669-bb771bc49051-kube-api-access-c5qw7\") pod \"cert-manager-759f64656b-p7gbp\" (UID: \"374be54b-d905-43fd-b669-bb771bc49051\") " pod="cert-manager/cert-manager-759f64656b-p7gbp"
Apr 20 13:36:59.147947 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.147839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/374be54b-d905-43fd-b669-bb771bc49051-bound-sa-token\") pod \"cert-manager-759f64656b-p7gbp\" (UID: \"374be54b-d905-43fd-b669-bb771bc49051\") " pod="cert-manager/cert-manager-759f64656b-p7gbp"
Apr 20 13:36:59.248846 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.248812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/374be54b-d905-43fd-b669-bb771bc49051-bound-sa-token\") pod \"cert-manager-759f64656b-p7gbp\" (UID: \"374be54b-d905-43fd-b669-bb771bc49051\") " pod="cert-manager/cert-manager-759f64656b-p7gbp"
Apr 20 13:36:59.249001 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.248869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5qw7\" (UniqueName:
\"kubernetes.io/projected/374be54b-d905-43fd-b669-bb771bc49051-kube-api-access-c5qw7\") pod \"cert-manager-759f64656b-p7gbp\" (UID: \"374be54b-d905-43fd-b669-bb771bc49051\") " pod="cert-manager/cert-manager-759f64656b-p7gbp" Apr 20 13:36:59.257606 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.257575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/374be54b-d905-43fd-b669-bb771bc49051-bound-sa-token\") pod \"cert-manager-759f64656b-p7gbp\" (UID: \"374be54b-d905-43fd-b669-bb771bc49051\") " pod="cert-manager/cert-manager-759f64656b-p7gbp" Apr 20 13:36:59.257727 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.257711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5qw7\" (UniqueName: \"kubernetes.io/projected/374be54b-d905-43fd-b669-bb771bc49051-kube-api-access-c5qw7\") pod \"cert-manager-759f64656b-p7gbp\" (UID: \"374be54b-d905-43fd-b669-bb771bc49051\") " pod="cert-manager/cert-manager-759f64656b-p7gbp" Apr 20 13:36:59.368810 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.368718 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-p7gbp" Apr 20 13:36:59.483654 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:36:59.483474 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-p7gbp"] Apr 20 13:36:59.486333 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:36:59.486306 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod374be54b_d905_43fd_b669_bb771bc49051.slice/crio-21c0413c1d44f93a6705aa9ec5c2820d4253d6d4dbe74c379d85e9813acc9c16 WatchSource:0}: Error finding container 21c0413c1d44f93a6705aa9ec5c2820d4253d6d4dbe74c379d85e9813acc9c16: Status 404 returned error can't find the container with id 21c0413c1d44f93a6705aa9ec5c2820d4253d6d4dbe74c379d85e9813acc9c16 Apr 20 13:37:00.360271 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:00.360237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-p7gbp" event={"ID":"374be54b-d905-43fd-b669-bb771bc49051","Type":"ContainerStarted","Data":"50599b29a70c50bbfa262e41c747fc57849aa9eab7e24ec6df569423b1cec4b3"} Apr 20 13:37:00.360271 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:00.360276 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-p7gbp" event={"ID":"374be54b-d905-43fd-b669-bb771bc49051","Type":"ContainerStarted","Data":"21c0413c1d44f93a6705aa9ec5c2820d4253d6d4dbe74c379d85e9813acc9c16"} Apr 20 13:37:00.378495 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:00.378447 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-p7gbp" podStartSLOduration=1.378433331 podStartE2EDuration="1.378433331s" podCreationTimestamp="2026-04-20 13:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:37:00.376272208 +0000 UTC m=+345.906198618" 
watchObservedRunningTime="2026-04-20 13:37:00.378433331 +0000 UTC m=+345.908359693" Apr 20 13:37:10.309899 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.309867 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559"] Apr 20 13:37:10.313197 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.313177 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.315395 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.315374 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 13:37:10.315586 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.315569 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 13:37:10.315653 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.315615 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 13:37:10.315717 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.315682 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 13:37:10.315960 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.315945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zmf5m\"" Apr 20 13:37:10.333162 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.333136 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559"] Apr 20 13:37:10.444378 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.444344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/230f3455-f9a1-4983-ac3e-e9043f1649be-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.444586 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.444411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/230f3455-f9a1-4983-ac3e-e9043f1649be-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.444586 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.444457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhxll\" (UniqueName: \"kubernetes.io/projected/230f3455-f9a1-4983-ac3e-e9043f1649be-kube-api-access-dhxll\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.545287 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.545251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/230f3455-f9a1-4983-ac3e-e9043f1649be-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.545444 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.545299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/230f3455-f9a1-4983-ac3e-e9043f1649be-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.545444 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.545326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhxll\" (UniqueName: \"kubernetes.io/projected/230f3455-f9a1-4983-ac3e-e9043f1649be-kube-api-access-dhxll\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.547729 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.547701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/230f3455-f9a1-4983-ac3e-e9043f1649be-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.547859 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.547772 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/230f3455-f9a1-4983-ac3e-e9043f1649be-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: \"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.572082 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.572025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhxll\" (UniqueName: \"kubernetes.io/projected/230f3455-f9a1-4983-ac3e-e9043f1649be-kube-api-access-dhxll\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-l8559\" (UID: 
\"230f3455-f9a1-4983-ac3e-e9043f1649be\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.623937 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.623906 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:10.750591 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:10.750495 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559"] Apr 20 13:37:10.753042 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:37:10.753011 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod230f3455_f9a1_4983_ac3e_e9043f1649be.slice/crio-545d3089cfc597d56404f21677ce64cb1f0948b649dcf01b5f111a2115144660 WatchSource:0}: Error finding container 545d3089cfc597d56404f21677ce64cb1f0948b649dcf01b5f111a2115144660: Status 404 returned error can't find the container with id 545d3089cfc597d56404f21677ce64cb1f0948b649dcf01b5f111a2115144660 Apr 20 13:37:11.395768 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:11.395700 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" event={"ID":"230f3455-f9a1-4983-ac3e-e9043f1649be","Type":"ContainerStarted","Data":"545d3089cfc597d56404f21677ce64cb1f0948b649dcf01b5f111a2115144660"} Apr 20 13:37:13.404803 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:13.404740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" event={"ID":"230f3455-f9a1-4983-ac3e-e9043f1649be","Type":"ContainerStarted","Data":"57d0122068f536b99126c8badec964a7602cc8218ffff87eff89571c1c3d228e"} Apr 20 13:37:13.405265 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:13.404881 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:13.425723 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:13.425675 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" podStartSLOduration=0.917725125 podStartE2EDuration="3.425660246s" podCreationTimestamp="2026-04-20 13:37:10 +0000 UTC" firstStartedPulling="2026-04-20 13:37:10.754790507 +0000 UTC m=+356.284716868" lastFinishedPulling="2026-04-20 13:37:13.262725618 +0000 UTC m=+358.792651989" observedRunningTime="2026-04-20 13:37:13.423762719 +0000 UTC m=+358.953689090" watchObservedRunningTime="2026-04-20 13:37:13.425660246 +0000 UTC m=+358.955586625" Apr 20 13:37:24.409527 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:24.409496 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-l8559" Apr 20 13:37:25.049165 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.049127 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v"] Apr 20 13:37:25.052542 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.052526 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.056098 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.056069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 13:37:25.056336 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.056316 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:37:25.056941 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.056925 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 13:37:25.057115 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.056931 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 13:37:25.057207 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.056983 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 13:37:25.057280 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.057026 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-qfg5c\"" Apr 20 13:37:25.063808 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.063786 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v"] Apr 20 13:37:25.064952 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.064930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-cert\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " 
pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.065102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.065088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-metrics-cert\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.065250 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.065237 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8jnr\" (UniqueName: \"kubernetes.io/projected/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-kube-api-access-d8jnr\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.065367 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.065353 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-manager-config\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.166426 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.166388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-cert\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.166615 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.166434 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-metrics-cert\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.166615 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.166482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8jnr\" (UniqueName: \"kubernetes.io/projected/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-kube-api-access-d8jnr\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.166615 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.166515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-manager-config\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.167467 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.167435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-manager-config\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.169264 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.169244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-metrics-cert\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: 
\"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.169437 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.169416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-cert\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.175088 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.175063 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8jnr\" (UniqueName: \"kubernetes.io/projected/570d9d9f-d844-4a4c-ba11-e19afebd1fd1-kube-api-access-d8jnr\") pod \"lws-controller-manager-59c6b8cc85-hll6v\" (UID: \"570d9d9f-d844-4a4c-ba11-e19afebd1fd1\") " pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.366047 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.365972 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:25.489722 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:25.489695 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v"] Apr 20 13:37:25.492775 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:37:25.492728 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod570d9d9f_d844_4a4c_ba11_e19afebd1fd1.slice/crio-90e1324626bdae1752ce13945af92edcd8211e24847e0c035ac233c97748b27b WatchSource:0}: Error finding container 90e1324626bdae1752ce13945af92edcd8211e24847e0c035ac233c97748b27b: Status 404 returned error can't find the container with id 90e1324626bdae1752ce13945af92edcd8211e24847e0c035ac233c97748b27b Apr 20 13:37:26.447208 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:26.447172 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" event={"ID":"570d9d9f-d844-4a4c-ba11-e19afebd1fd1","Type":"ContainerStarted","Data":"90e1324626bdae1752ce13945af92edcd8211e24847e0c035ac233c97748b27b"} Apr 20 13:37:29.460020 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:29.459982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" event={"ID":"570d9d9f-d844-4a4c-ba11-e19afebd1fd1","Type":"ContainerStarted","Data":"575024589c5cf0a8c1077b3fea2f3de0db87f446bd3737cfc75b5c30acebaba7"} Apr 20 13:37:29.460454 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:29.460093 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:29.478664 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:29.478611 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" podStartSLOduration=1.399135955 podStartE2EDuration="4.478595391s" podCreationTimestamp="2026-04-20 13:37:25 +0000 UTC" firstStartedPulling="2026-04-20 13:37:25.495054229 +0000 UTC m=+371.024980587" lastFinishedPulling="2026-04-20 13:37:28.574513664 +0000 UTC m=+374.104440023" observedRunningTime="2026-04-20 13:37:29.477934674 +0000 UTC m=+375.007861064" watchObservedRunningTime="2026-04-20 13:37:29.478595391 +0000 UTC m=+375.008521776" Apr 20 13:37:39.511984 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.511953 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-54c65669-bwfbg"] Apr 20 13:37:39.516222 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.516195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.519576 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.519550 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 13:37:39.519902 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.519683 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 13:37:39.520056 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.520035 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 13:37:39.520473 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.520450 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 13:37:39.522126 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.522107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-957bg\"" Apr 20 13:37:39.529585 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:37:39.529562 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-54c65669-bwfbg"] Apr 20 13:37:39.596000 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.595970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcdwq\" (UniqueName: \"kubernetes.io/projected/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-kube-api-access-pcdwq\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.596188 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.596074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-tls-certs\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.596188 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.596120 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-tmp\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.697324 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.697287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-tls-certs\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.697532 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.697349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-tmp\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.697532 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.697428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcdwq\" (UniqueName: \"kubernetes.io/projected/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-kube-api-access-pcdwq\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.699576 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.699553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-tmp\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.699811 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.699794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-tls-certs\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.706708 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.706686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcdwq\" (UniqueName: \"kubernetes.io/projected/1ee2c78f-e61b-4c6a-b5e3-0b238256afb8-kube-api-access-pcdwq\") pod \"kube-auth-proxy-54c65669-bwfbg\" (UID: \"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8\") " pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.830157 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.830062 2573 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" Apr 20 13:37:39.951643 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:39.951618 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-54c65669-bwfbg"] Apr 20 13:37:39.953884 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:37:39.953856 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee2c78f_e61b_4c6a_b5e3_0b238256afb8.slice/crio-9dbe5f7a33353039b764b67387a688cbaabef9e4fa1eeffe99aaf2350b7029d6 WatchSource:0}: Error finding container 9dbe5f7a33353039b764b67387a688cbaabef9e4fa1eeffe99aaf2350b7029d6: Status 404 returned error can't find the container with id 9dbe5f7a33353039b764b67387a688cbaabef9e4fa1eeffe99aaf2350b7029d6 Apr 20 13:37:40.465620 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:40.465586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59c6b8cc85-hll6v" Apr 20 13:37:40.503718 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:40.503681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" event={"ID":"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8","Type":"ContainerStarted","Data":"9dbe5f7a33353039b764b67387a688cbaabef9e4fa1eeffe99aaf2350b7029d6"} Apr 20 13:37:43.515886 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:43.515854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" event={"ID":"1ee2c78f-e61b-4c6a-b5e3-0b238256afb8","Type":"ContainerStarted","Data":"311394c5308285e90438540ecdbb005379930a4532c8dfaeaaafbc2cb35f4112"} Apr 20 13:37:43.533801 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:37:43.533735 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-54c65669-bwfbg" podStartSLOduration=1.331452179 
podStartE2EDuration="4.533721666s" podCreationTimestamp="2026-04-20 13:37:39 +0000 UTC" firstStartedPulling="2026-04-20 13:37:39.955724619 +0000 UTC m=+385.485650977" lastFinishedPulling="2026-04-20 13:37:43.157994102 +0000 UTC m=+388.687920464" observedRunningTime="2026-04-20 13:37:43.532734659 +0000 UTC m=+389.062661040" watchObservedRunningTime="2026-04-20 13:37:43.533721666 +0000 UTC m=+389.063648046" Apr 20 13:39:12.233913 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.233868 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9"] Apr 20 13:39:12.239781 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.239479 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.242110 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.242082 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 13:39:12.242225 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.242115 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 13:39:12.242279 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.242249 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 13:39:12.243093 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.243079 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 13:39:12.243160 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.243105 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bbcr5\"" Apr 20 13:39:12.245721 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.245703 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9"] Apr 20 13:39:12.295865 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.295839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/558807db-4efc-4c0f-843b-a01dde2697a8-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.295971 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.295876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2c8d\" (UniqueName: \"kubernetes.io/projected/558807db-4efc-4c0f-843b-a01dde2697a8-kube-api-access-x2c8d\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.295971 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.295910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/558807db-4efc-4c0f-843b-a01dde2697a8-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.396644 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.396606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/558807db-4efc-4c0f-843b-a01dde2697a8-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.396644 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.396644 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x2c8d\" (UniqueName: \"kubernetes.io/projected/558807db-4efc-4c0f-843b-a01dde2697a8-kube-api-access-x2c8d\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.396893 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.396673 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/558807db-4efc-4c0f-843b-a01dde2697a8-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.396893 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:39:12.396786 2573 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 13:39:12.396893 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:39:12.396868 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/558807db-4efc-4c0f-843b-a01dde2697a8-plugin-serving-cert podName:558807db-4efc-4c0f-843b-a01dde2697a8 nodeName:}" failed. No retries permitted until 2026-04-20 13:39:12.896847721 +0000 UTC m=+478.426774092 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/558807db-4efc-4c0f-843b-a01dde2697a8-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-8s6t9" (UID: "558807db-4efc-4c0f-843b-a01dde2697a8") : secret "plugin-serving-cert" not found Apr 20 13:39:12.397291 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.397272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/558807db-4efc-4c0f-843b-a01dde2697a8-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.404477 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.404451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2c8d\" (UniqueName: \"kubernetes.io/projected/558807db-4efc-4c0f-843b-a01dde2697a8-kube-api-access-x2c8d\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.901517 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.901487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/558807db-4efc-4c0f-843b-a01dde2697a8-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:12.903950 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:12.903920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/558807db-4efc-4c0f-843b-a01dde2697a8-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-8s6t9\" (UID: \"558807db-4efc-4c0f-843b-a01dde2697a8\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:13.150611 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:13.150578 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" Apr 20 13:39:13.274085 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:13.274063 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9"] Apr 20 13:39:13.276894 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:39:13.276865 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod558807db_4efc_4c0f_843b_a01dde2697a8.slice/crio-e3e75b51e2d682734cb7fe80f1493a8cea3731a5e3ecf97d22e5bee8169f6e5b WatchSource:0}: Error finding container e3e75b51e2d682734cb7fe80f1493a8cea3731a5e3ecf97d22e5bee8169f6e5b: Status 404 returned error can't find the container with id e3e75b51e2d682734cb7fe80f1493a8cea3731a5e3ecf97d22e5bee8169f6e5b Apr 20 13:39:13.816844 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:13.816796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" event={"ID":"558807db-4efc-4c0f-843b-a01dde2697a8","Type":"ContainerStarted","Data":"e3e75b51e2d682734cb7fe80f1493a8cea3731a5e3ecf97d22e5bee8169f6e5b"} Apr 20 13:39:36.905042 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:36.905001 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" event={"ID":"558807db-4efc-4c0f-843b-a01dde2697a8","Type":"ContainerStarted","Data":"402fdbe076d93abff4af6fe6105f63928995a71f2a177ac4040c4d6e8c195fbb"} Apr 20 13:39:36.921693 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:36.921636 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-8s6t9" podStartSLOduration=1.523254871 
podStartE2EDuration="24.921618422s" podCreationTimestamp="2026-04-20 13:39:12 +0000 UTC" firstStartedPulling="2026-04-20 13:39:13.278219525 +0000 UTC m=+478.808145886" lastFinishedPulling="2026-04-20 13:39:36.676583076 +0000 UTC m=+502.206509437" observedRunningTime="2026-04-20 13:39:36.921025339 +0000 UTC m=+502.450951722" watchObservedRunningTime="2026-04-20 13:39:36.921618422 +0000 UTC m=+502.451544803" Apr 20 13:39:56.952238 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:56.952203 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:39:57.001099 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.001066 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:39:57.001099 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.001095 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:39:57.001288 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.001191 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.004068 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.004044 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 13:39:57.099717 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.099678 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/059e2fad-2cbe-4f6a-b0ad-c0cc4209770e-config-file\") pod \"limitador-limitador-78c99df468-mdvfc\" (UID: \"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e\") " pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.099907 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.099740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjhl\" (UniqueName: \"kubernetes.io/projected/059e2fad-2cbe-4f6a-b0ad-c0cc4209770e-kube-api-access-4kjhl\") pod \"limitador-limitador-78c99df468-mdvfc\" (UID: \"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e\") " pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.200932 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.200892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/059e2fad-2cbe-4f6a-b0ad-c0cc4209770e-config-file\") pod \"limitador-limitador-78c99df468-mdvfc\" (UID: \"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e\") " pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.201117 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.200997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kjhl\" (UniqueName: \"kubernetes.io/projected/059e2fad-2cbe-4f6a-b0ad-c0cc4209770e-kube-api-access-4kjhl\") pod \"limitador-limitador-78c99df468-mdvfc\" (UID: 
\"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e\") " pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.201647 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.201622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/059e2fad-2cbe-4f6a-b0ad-c0cc4209770e-config-file\") pod \"limitador-limitador-78c99df468-mdvfc\" (UID: \"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e\") " pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.209083 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.209025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kjhl\" (UniqueName: \"kubernetes.io/projected/059e2fad-2cbe-4f6a-b0ad-c0cc4209770e-kube-api-access-4kjhl\") pod \"limitador-limitador-78c99df468-mdvfc\" (UID: \"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e\") " pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.311805 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.311770 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:39:57.436811 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.436781 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:39:57.440084 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:39:57.440043 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059e2fad_2cbe_4f6a_b0ad_c0cc4209770e.slice/crio-927c086c7db24c51506d9248103b7bf63f030e3ad8b39c9618abf083fe337182 WatchSource:0}: Error finding container 927c086c7db24c51506d9248103b7bf63f030e3ad8b39c9618abf083fe337182: Status 404 returned error can't find the container with id 927c086c7db24c51506d9248103b7bf63f030e3ad8b39c9618abf083fe337182 Apr 20 13:39:57.982174 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:39:57.982130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" event={"ID":"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e","Type":"ContainerStarted","Data":"927c086c7db24c51506d9248103b7bf63f030e3ad8b39c9618abf083fe337182"} Apr 20 13:40:00.997468 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:40:00.997432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" event={"ID":"059e2fad-2cbe-4f6a-b0ad-c0cc4209770e","Type":"ContainerStarted","Data":"a4757fa68cec360dd9c6fb3d08a2eb2132c9f5f2beb5bc09f8865f1ae3d5b6b5"} Apr 20 13:40:00.997856 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:40:00.997653 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:40:01.022065 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:40:01.022022 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" podStartSLOduration=1.837770759 
podStartE2EDuration="5.022007521s" podCreationTimestamp="2026-04-20 13:39:56 +0000 UTC" firstStartedPulling="2026-04-20 13:39:57.441980159 +0000 UTC m=+522.971906516" lastFinishedPulling="2026-04-20 13:40:00.626216903 +0000 UTC m=+526.156143278" observedRunningTime="2026-04-20 13:40:01.019588559 +0000 UTC m=+526.549514938" watchObservedRunningTime="2026-04-20 13:40:01.022007521 +0000 UTC m=+526.551933900" Apr 20 13:40:12.002205 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:40:12.002179 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-mdvfc" Apr 20 13:40:38.619113 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:40:38.619079 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:41:15.046507 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:41:15.046477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:41:15.047280 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:41:15.047262 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:41:15.645508 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:41:15.645473 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:41:19.241938 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:41:19.241898 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:41:48.151695 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:41:48.151658 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:42:01.548008 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:42:01.547967 2573 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:42:15.947076 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:42:15.947039 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:05.743334 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:05.743300 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:09.847838 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:09.847806 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:16.146657 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:16.146621 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:26.137487 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:26.137453 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:36.065161 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:36.065126 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:46.252627 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:46.252590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:43:54.844583 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:43:54.844547 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:44:04.947708 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:44:04.947671 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:45:00.140627 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.140589 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/maas-api-key-cleanup-29611545-r92q7"] Apr 20 13:45:00.144141 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.144120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 13:45:00.146371 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.146349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hn6q8\"" Apr 20 13:45:00.149922 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.149902 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611545-r92q7"] Apr 20 13:45:00.254657 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.254603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t45c\" (UniqueName: \"kubernetes.io/projected/0ce2a2cf-0b05-4b14-a191-133442476eb5-kube-api-access-8t45c\") pod \"maas-api-key-cleanup-29611545-r92q7\" (UID: \"0ce2a2cf-0b05-4b14-a191-133442476eb5\") " pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 13:45:00.355622 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.355577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t45c\" (UniqueName: \"kubernetes.io/projected/0ce2a2cf-0b05-4b14-a191-133442476eb5-kube-api-access-8t45c\") pod \"maas-api-key-cleanup-29611545-r92q7\" (UID: \"0ce2a2cf-0b05-4b14-a191-133442476eb5\") " pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 13:45:00.364594 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.364569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t45c\" (UniqueName: \"kubernetes.io/projected/0ce2a2cf-0b05-4b14-a191-133442476eb5-kube-api-access-8t45c\") pod \"maas-api-key-cleanup-29611545-r92q7\" (UID: \"0ce2a2cf-0b05-4b14-a191-133442476eb5\") " pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 
13:45:00.454815 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.454703 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 13:45:00.581406 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.581379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611545-r92q7"] Apr 20 13:45:00.583507 ip-10-0-133-1 kubenswrapper[2573]: W0420 13:45:00.583479 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ce2a2cf_0b05_4b14_a191_133442476eb5.slice/crio-51d366254be674ca82b05e08e2fafc45048469e904f9e69fed35516c4aa390d6 WatchSource:0}: Error finding container 51d366254be674ca82b05e08e2fafc45048469e904f9e69fed35516c4aa390d6: Status 404 returned error can't find the container with id 51d366254be674ca82b05e08e2fafc45048469e904f9e69fed35516c4aa390d6 Apr 20 13:45:00.585629 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:00.585610 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 13:45:01.020321 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:01.020286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerStarted","Data":"51d366254be674ca82b05e08e2fafc45048469e904f9e69fed35516c4aa390d6"} Apr 20 13:45:04.033788 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:04.033744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerStarted","Data":"7a92e3aa7a8203766698e976c8b957b92e8c95746ae0fa2cd89740e806c91a00"} Apr 20 13:45:04.051596 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:04.051549 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" podStartSLOduration=1.337103822 podStartE2EDuration="4.05153503s" podCreationTimestamp="2026-04-20 13:45:00 +0000 UTC" firstStartedPulling="2026-04-20 13:45:00.585781051 +0000 UTC m=+826.115707409" lastFinishedPulling="2026-04-20 13:45:03.300212256 +0000 UTC m=+828.830138617" observedRunningTime="2026-04-20 13:45:04.050974405 +0000 UTC m=+829.580900785" watchObservedRunningTime="2026-04-20 13:45:04.05153503 +0000 UTC m=+829.581461409" Apr 20 13:45:08.241261 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:08.241224 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:45:23.242519 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:23.242486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:45:24.109991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:24.109954 2573 generic.go:358] "Generic (PLEG): container finished" podID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerID="7a92e3aa7a8203766698e976c8b957b92e8c95746ae0fa2cd89740e806c91a00" exitCode=6 Apr 20 13:45:24.110150 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:24.110014 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerDied","Data":"7a92e3aa7a8203766698e976c8b957b92e8c95746ae0fa2cd89740e806c91a00"} Apr 20 13:45:24.110360 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:24.110345 2573 scope.go:117] "RemoveContainer" containerID="7a92e3aa7a8203766698e976c8b957b92e8c95746ae0fa2cd89740e806c91a00" Apr 20 13:45:25.115173 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:25.115140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" 
event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerStarted","Data":"51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475"} Apr 20 13:45:45.186904 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:45.186818 2573 generic.go:358] "Generic (PLEG): container finished" podID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerID="51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475" exitCode=6 Apr 20 13:45:45.187328 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:45.186896 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerDied","Data":"51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475"} Apr 20 13:45:45.187328 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:45.186943 2573 scope.go:117] "RemoveContainer" containerID="7a92e3aa7a8203766698e976c8b957b92e8c95746ae0fa2cd89740e806c91a00" Apr 20 13:45:45.187328 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:45.187294 2573 scope.go:117] "RemoveContainer" containerID="51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475" Apr 20 13:45:45.187525 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:45:45.187505 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611545-r92q7_opendatahub(0ce2a2cf-0b05-4b14-a191-133442476eb5)\"" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" Apr 20 13:45:57.130035 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:57.129964 2573 scope.go:117] "RemoveContainer" containerID="51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475" Apr 20 13:45:58.237102 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:58.237067 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" 
event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerStarted","Data":"a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4"} Apr 20 13:45:59.266338 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:59.266308 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611545-r92q7"] Apr 20 13:45:59.266743 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:45:59.266515 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" containerID="cri-o://a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4" gracePeriod=30 Apr 20 13:46:02.554917 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:02.554885 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:46:15.070819 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:15.070782 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:46:15.073005 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:15.072979 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:46:17.906378 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:17.906353 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 13:46:17.937534 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:17.937503 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t45c\" (UniqueName: \"kubernetes.io/projected/0ce2a2cf-0b05-4b14-a191-133442476eb5-kube-api-access-8t45c\") pod \"0ce2a2cf-0b05-4b14-a191-133442476eb5\" (UID: \"0ce2a2cf-0b05-4b14-a191-133442476eb5\") " Apr 20 13:46:17.939587 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:17.939555 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce2a2cf-0b05-4b14-a191-133442476eb5-kube-api-access-8t45c" (OuterVolumeSpecName: "kube-api-access-8t45c") pod "0ce2a2cf-0b05-4b14-a191-133442476eb5" (UID: "0ce2a2cf-0b05-4b14-a191-133442476eb5"). InnerVolumeSpecName "kube-api-access-8t45c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:46:18.038540 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.038462 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8t45c\" (UniqueName: \"kubernetes.io/projected/0ce2a2cf-0b05-4b14-a191-133442476eb5-kube-api-access-8t45c\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 13:46:18.305008 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.304915 2573 generic.go:358] "Generic (PLEG): container finished" podID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerID="a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4" exitCode=6 Apr 20 13:46:18.305008 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.304982 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" Apr 20 13:46:18.305008 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.304998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerDied","Data":"a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4"} Apr 20 13:46:18.305282 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.305041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611545-r92q7" event={"ID":"0ce2a2cf-0b05-4b14-a191-133442476eb5","Type":"ContainerDied","Data":"51d366254be674ca82b05e08e2fafc45048469e904f9e69fed35516c4aa390d6"} Apr 20 13:46:18.305282 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.305059 2573 scope.go:117] "RemoveContainer" containerID="a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4" Apr 20 13:46:18.313339 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.313323 2573 scope.go:117] "RemoveContainer" containerID="51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475" Apr 20 13:46:18.320453 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.320437 2573 scope.go:117] "RemoveContainer" containerID="a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4" Apr 20 13:46:18.320701 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:46:18.320683 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4\": container with ID starting with a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4 not found: ID does not exist" containerID="a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4" Apr 20 13:46:18.320801 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.320709 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4"} err="failed to get container status \"a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4\": rpc error: code = NotFound desc = could not find container \"a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4\": container with ID starting with a6ad679392673efc69e4e6a3d55dcc56048ce82d09d0e46a9d0b8a99ed047ed4 not found: ID does not exist" Apr 20 13:46:18.320801 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.320726 2573 scope.go:117] "RemoveContainer" containerID="51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475" Apr 20 13:46:18.320953 ip-10-0-133-1 kubenswrapper[2573]: E0420 13:46:18.320939 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475\": container with ID starting with 51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475 not found: ID does not exist" containerID="51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475" Apr 20 13:46:18.320991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.320956 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475"} err="failed to get container status \"51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475\": rpc error: code = NotFound desc = could not find container \"51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475\": container with ID starting with 51aec74f16a904824c43e8800657c4c4e9aa8cd7da6a28b2fc323bafb34a7475 not found: ID does not exist" Apr 20 13:46:18.324521 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:18.324500 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611545-r92q7"] Apr 20 13:46:18.326069 ip-10-0-133-1 kubenswrapper[2573]: 
I0420 13:46:18.326048 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611545-r92q7"] Apr 20 13:46:19.134211 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:19.134179 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" path="/var/lib/kubelet/pods/0ce2a2cf-0b05-4b14-a191-133442476eb5/volumes" Apr 20 13:46:19.252902 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:19.252870 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:46:34.505183 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:34.505146 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:46:49.847876 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:46:49.847838 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:47:46.445432 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:47:46.445350 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:47:56.035297 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:47:56.035264 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:48:12.241558 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:48:12.241524 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:48:20.557444 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:48:20.557406 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:48:38.058363 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:48:38.058328 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:48:46.539993 ip-10-0-133-1 
kubenswrapper[2573]: I0420 13:48:46.539962 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:49:19.051280 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:49:19.051196 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:49:27.837766 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:49:27.837724 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:49:36.249106 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:49:36.249074 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:49:44.842978 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:49:44.842945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:49:52.947233 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:49:52.947197 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:50:09.948023 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:50:09.947981 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:50:20.246593 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:50:20.246559 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:51:07.948876 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:51:07.948843 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:51:15.099332 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:51:15.099302 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:51:15.101442 
ip-10-0-133-1 kubenswrapper[2573]: I0420 13:51:15.101420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:52:48.347991 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:52:48.347953 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:52:56.850060 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:52:56.850023 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:53:05.548337 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:53:05.548303 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:53:15.660298 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:53:15.660260 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:53:22.951045 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:53:22.951007 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:53:32.443243 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:53:32.443205 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:55:11.460855 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:55:11.460778 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:55:20.447232 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:55:20.447193 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:56:15.123248 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:56:15.123214 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:56:15.125245 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:56:15.125220 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 13:57:00.446884 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:00.446842 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:57:09.962041 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:09.962004 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:57:18.641903 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:18.641870 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:57:27.946910 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:27.946871 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:57:35.939446 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:35.939411 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:57:44.643247 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:44.643209 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:57:53.047841 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:57:53.047806 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:58:02.449570 ip-10-0-133-1 kubenswrapper[2573]: I0420 13:58:02.449538 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 13:58:10.145533 ip-10-0-133-1 kubenswrapper[2573]: I0420 
13:58:10.145499 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:00:00.140476 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140436 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611560-zc5l9"] Apr 20 14:00:00.140964 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140815 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:00:00.140964 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140827 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:00:00.140964 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140854 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:00:00.140964 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140861 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:00:00.140964 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140914 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:00:00.140964 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.140927 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:00:00.144054 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.144039 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:00:00.146363 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.146345 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hn6q8\"" Apr 20 14:00:00.149794 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.149768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611560-zc5l9"] Apr 20 14:00:00.213617 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.213584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spplr\" (UniqueName: \"kubernetes.io/projected/1749ca5c-c53b-49f3-9bdb-09505718fd16-kube-api-access-spplr\") pod \"maas-api-key-cleanup-29611560-zc5l9\" (UID: \"1749ca5c-c53b-49f3-9bdb-09505718fd16\") " pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:00:00.313997 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.313959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spplr\" (UniqueName: \"kubernetes.io/projected/1749ca5c-c53b-49f3-9bdb-09505718fd16-kube-api-access-spplr\") pod \"maas-api-key-cleanup-29611560-zc5l9\" (UID: \"1749ca5c-c53b-49f3-9bdb-09505718fd16\") " pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:00:00.322568 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.322542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spplr\" (UniqueName: \"kubernetes.io/projected/1749ca5c-c53b-49f3-9bdb-09505718fd16-kube-api-access-spplr\") pod \"maas-api-key-cleanup-29611560-zc5l9\" (UID: \"1749ca5c-c53b-49f3-9bdb-09505718fd16\") " pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:00:00.455187 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.455104 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:00:00.578811 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.578784 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611560-zc5l9"] Apr 20 14:00:00.580852 ip-10-0-133-1 kubenswrapper[2573]: W0420 14:00:00.580824 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1749ca5c_c53b_49f3_9bdb_09505718fd16.slice/crio-d75b06ef148a37b9c27c03374cef05590110ac4a83fa8e69f590d02e61454420 WatchSource:0}: Error finding container d75b06ef148a37b9c27c03374cef05590110ac4a83fa8e69f590d02e61454420: Status 404 returned error can't find the container with id d75b06ef148a37b9c27c03374cef05590110ac4a83fa8e69f590d02e61454420 Apr 20 14:00:00.582504 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:00.582479 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:00:01.088959 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:01.088860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerStarted","Data":"790b2de1ead17015f7ca7470789cc59a8f529f4510bd0ae79cc8c042c9d88d28"} Apr 20 14:00:01.088959 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:01.088902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerStarted","Data":"d75b06ef148a37b9c27c03374cef05590110ac4a83fa8e69f590d02e61454420"} Apr 20 14:00:01.105149 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:01.105088 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" podStartSLOduration=1.105068298 podStartE2EDuration="1.105068298s" podCreationTimestamp="2026-04-20 
14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:00:01.103862828 +0000 UTC m=+1726.633789209" watchObservedRunningTime="2026-04-20 14:00:01.105068298 +0000 UTC m=+1726.634994678" Apr 20 14:00:22.165480 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:22.165444 2573 generic.go:358] "Generic (PLEG): container finished" podID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerID="790b2de1ead17015f7ca7470789cc59a8f529f4510bd0ae79cc8c042c9d88d28" exitCode=6 Apr 20 14:00:22.165855 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:22.165517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerDied","Data":"790b2de1ead17015f7ca7470789cc59a8f529f4510bd0ae79cc8c042c9d88d28"} Apr 20 14:00:22.165916 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:22.165855 2573 scope.go:117] "RemoveContainer" containerID="790b2de1ead17015f7ca7470789cc59a8f529f4510bd0ae79cc8c042c9d88d28" Apr 20 14:00:23.170457 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:23.170420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerStarted","Data":"6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd"} Apr 20 14:00:30.650230 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:30.650191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:00:35.951956 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:35.951918 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:00:43.235807 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:43.235777 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerID="6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd" exitCode=6 Apr 20 14:00:43.236220 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:43.235837 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerDied","Data":"6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd"} Apr 20 14:00:43.236220 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:43.235867 2573 scope.go:117] "RemoveContainer" containerID="790b2de1ead17015f7ca7470789cc59a8f529f4510bd0ae79cc8c042c9d88d28" Apr 20 14:00:43.236220 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:43.236179 2573 scope.go:117] "RemoveContainer" containerID="6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd" Apr 20 14:00:43.236415 ip-10-0-133-1 kubenswrapper[2573]: E0420 14:00:43.236394 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611560-zc5l9_opendatahub(1749ca5c-c53b-49f3-9bdb-09505718fd16)\"" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" Apr 20 14:00:57.130418 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:57.130343 2573 scope.go:117] "RemoveContainer" containerID="6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd" Apr 20 14:00:58.295684 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:58.295647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerStarted","Data":"17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7"} Apr 20 14:00:59.320690 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:59.320655 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-api-key-cleanup-29611560-zc5l9"] Apr 20 14:00:59.321133 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:00:59.320958 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" containerID="cri-o://17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7" gracePeriod=30 Apr 20 14:01:01.148123 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:01.148085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:01:07.643537 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:07.643500 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:01:15.155157 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:15.155127 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 14:01:15.157892 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:15.157872 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log" Apr 20 14:01:17.053099 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:17.053065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:01:17.959811 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:17.959791 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:01:18.084665 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.084581 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spplr\" (UniqueName: \"kubernetes.io/projected/1749ca5c-c53b-49f3-9bdb-09505718fd16-kube-api-access-spplr\") pod \"1749ca5c-c53b-49f3-9bdb-09505718fd16\" (UID: \"1749ca5c-c53b-49f3-9bdb-09505718fd16\") " Apr 20 14:01:18.086693 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.086670 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1749ca5c-c53b-49f3-9bdb-09505718fd16-kube-api-access-spplr" (OuterVolumeSpecName: "kube-api-access-spplr") pod "1749ca5c-c53b-49f3-9bdb-09505718fd16" (UID: "1749ca5c-c53b-49f3-9bdb-09505718fd16"). InnerVolumeSpecName "kube-api-access-spplr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:01:18.186332 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.186296 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spplr\" (UniqueName: \"kubernetes.io/projected/1749ca5c-c53b-49f3-9bdb-09505718fd16-kube-api-access-spplr\") on node \"ip-10-0-133-1.ec2.internal\" DevicePath \"\"" Apr 20 14:01:18.367000 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.366905 2573 generic.go:358] "Generic (PLEG): container finished" podID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerID="17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7" exitCode=6 Apr 20 14:01:18.367000 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.366978 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" Apr 20 14:01:18.367000 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.366983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerDied","Data":"17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7"} Apr 20 14:01:18.367272 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.367017 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611560-zc5l9" event={"ID":"1749ca5c-c53b-49f3-9bdb-09505718fd16","Type":"ContainerDied","Data":"d75b06ef148a37b9c27c03374cef05590110ac4a83fa8e69f590d02e61454420"} Apr 20 14:01:18.367272 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.367032 2573 scope.go:117] "RemoveContainer" containerID="17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7" Apr 20 14:01:18.375509 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.375491 2573 scope.go:117] "RemoveContainer" containerID="6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd" Apr 20 14:01:18.382818 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.382800 2573 scope.go:117] "RemoveContainer" containerID="17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7" Apr 20 14:01:18.383060 ip-10-0-133-1 kubenswrapper[2573]: E0420 14:01:18.383040 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7\": container with ID starting with 17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7 not found: ID does not exist" containerID="17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7" Apr 20 14:01:18.383122 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.383068 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7"} err="failed to get container status \"17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7\": rpc error: code = NotFound desc = could not find container \"17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7\": container with ID starting with 17c0ad8e61cf71cf6e9d1b63c8166d082293b55f56f9c719b453a3e716f15dc7 not found: ID does not exist" Apr 20 14:01:18.383122 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.383087 2573 scope.go:117] "RemoveContainer" containerID="6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd" Apr 20 14:01:18.383300 ip-10-0-133-1 kubenswrapper[2573]: E0420 14:01:18.383283 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd\": container with ID starting with 6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd not found: ID does not exist" containerID="6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd" Apr 20 14:01:18.383341 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.383306 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd"} err="failed to get container status \"6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd\": rpc error: code = NotFound desc = could not find container \"6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd\": container with ID starting with 6cf0c7a5d4032731fb80b16b595cae0119bf29d26199ef792256ded1739cb0fd not found: ID does not exist" Apr 20 14:01:18.389069 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:18.389048 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611560-zc5l9"] Apr 20 14:01:18.392579 ip-10-0-133-1 kubenswrapper[2573]: 
I0420 14:01:18.392557 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611560-zc5l9"] Apr 20 14:01:19.134533 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:19.134492 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" path="/var/lib/kubelet/pods/1749ca5c-c53b-49f3-9bdb-09505718fd16/volumes" Apr 20 14:01:26.847485 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:26.847450 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-mdvfc"] Apr 20 14:01:38.813173 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:38.813147 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8cd4c57cb-l8559_230f3455-f9a1-4983-ac3e-e9043f1649be/manager/0.log" Apr 20 14:01:40.785287 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:40.785261 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-8s6t9_558807db-4efc-4c0f-843b-a01dde2697a8/kuadrant-console-plugin/0.log" Apr 20 14:01:41.132467 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:41.132388 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-mdvfc_059e2fad-2cbe-4f6a-b0ad-c0cc4209770e/limitador/0.log" Apr 20 14:01:41.824943 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:41.824903 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-54c65669-bwfbg_1ee2c78f-e61b-4c6a-b5e3-0b238256afb8/kube-auth-proxy/0.log" Apr 20 14:01:47.445404 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445367 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7jq8s/must-gather-ppm9v"] Apr 20 14:01:47.445791 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445769 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445791 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445785 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445870 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445797 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:01:47.445870 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445803 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:01:47.445870 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445811 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445870 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445817 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445988 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445907 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445988 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445918 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445988 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445926 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup" Apr 20 14:01:47.445988 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445932 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ce2a2cf-0b05-4b14-a191-133442476eb5" containerName="cleanup" Apr 20 14:01:47.446101 
ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.445999 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup"
Apr 20 14:01:47.446101 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.446005 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1749ca5c-c53b-49f3-9bdb-09505718fd16" containerName="cleanup"
Apr 20 14:01:47.449119 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.449100 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.451365 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.451344 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7jq8s\"/\"openshift-service-ca.crt\""
Apr 20 14:01:47.451365 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.451362 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7jq8s\"/\"kube-root-ca.crt\""
Apr 20 14:01:47.452150 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.452133 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7jq8s\"/\"default-dockercfg-w22fr\""
Apr 20 14:01:47.456045 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.456020 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/must-gather-ppm9v"]
Apr 20 14:01:47.531896 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.531857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a3f97b3b-2ffa-4232-bbab-43bf76463bf3-must-gather-output\") pod \"must-gather-ppm9v\" (UID: \"a3f97b3b-2ffa-4232-bbab-43bf76463bf3\") " pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.532072 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.531909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vbv\" (UniqueName: \"kubernetes.io/projected/a3f97b3b-2ffa-4232-bbab-43bf76463bf3-kube-api-access-h2vbv\") pod \"must-gather-ppm9v\" (UID: \"a3f97b3b-2ffa-4232-bbab-43bf76463bf3\") " pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.632805 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.632744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a3f97b3b-2ffa-4232-bbab-43bf76463bf3-must-gather-output\") pod \"must-gather-ppm9v\" (UID: \"a3f97b3b-2ffa-4232-bbab-43bf76463bf3\") " pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.632805 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.632813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vbv\" (UniqueName: \"kubernetes.io/projected/a3f97b3b-2ffa-4232-bbab-43bf76463bf3-kube-api-access-h2vbv\") pod \"must-gather-ppm9v\" (UID: \"a3f97b3b-2ffa-4232-bbab-43bf76463bf3\") " pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.633085 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.633066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a3f97b3b-2ffa-4232-bbab-43bf76463bf3-must-gather-output\") pod \"must-gather-ppm9v\" (UID: \"a3f97b3b-2ffa-4232-bbab-43bf76463bf3\") " pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.640815 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.640788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vbv\" (UniqueName: \"kubernetes.io/projected/a3f97b3b-2ffa-4232-bbab-43bf76463bf3-kube-api-access-h2vbv\") pod \"must-gather-ppm9v\" (UID: \"a3f97b3b-2ffa-4232-bbab-43bf76463bf3\") " pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.759150 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.759120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/must-gather-ppm9v"
Apr 20 14:01:47.874810 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:47.874739 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/must-gather-ppm9v"]
Apr 20 14:01:47.877694 ip-10-0-133-1 kubenswrapper[2573]: W0420 14:01:47.877658 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3f97b3b_2ffa_4232_bbab_43bf76463bf3.slice/crio-11ac7d1c42b25e578d1d1a9f084f9806d14ef09b11d9e5f703587601407b6dc5 WatchSource:0}: Error finding container 11ac7d1c42b25e578d1d1a9f084f9806d14ef09b11d9e5f703587601407b6dc5: Status 404 returned error can't find the container with id 11ac7d1c42b25e578d1d1a9f084f9806d14ef09b11d9e5f703587601407b6dc5
Apr 20 14:01:48.472929 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:48.472884 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/must-gather-ppm9v" event={"ID":"a3f97b3b-2ffa-4232-bbab-43bf76463bf3","Type":"ContainerStarted","Data":"11ac7d1c42b25e578d1d1a9f084f9806d14ef09b11d9e5f703587601407b6dc5"}
Apr 20 14:01:49.479884 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:49.479850 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/must-gather-ppm9v" event={"ID":"a3f97b3b-2ffa-4232-bbab-43bf76463bf3","Type":"ContainerStarted","Data":"9d95867b7995d5da3f4fdb61ce1422df71e71f1b70db3aafbe6205843de74741"}
Apr 20 14:01:49.479884 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:49.479889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/must-gather-ppm9v" event={"ID":"a3f97b3b-2ffa-4232-bbab-43bf76463bf3","Type":"ContainerStarted","Data":"2b8b263c5ecd983c825e596b47198f076c20fcb37c726320212b0d2c4c0fd4d1"}
Apr 20 14:01:49.502000 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:49.501939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7jq8s/must-gather-ppm9v" podStartSLOduration=1.620164215 podStartE2EDuration="2.501920478s" podCreationTimestamp="2026-04-20 14:01:47 +0000 UTC" firstStartedPulling="2026-04-20 14:01:47.879523045 +0000 UTC m=+1833.409449404" lastFinishedPulling="2026-04-20 14:01:48.761279297 +0000 UTC m=+1834.291205667" observedRunningTime="2026-04-20 14:01:49.500798568 +0000 UTC m=+1835.030724946" watchObservedRunningTime="2026-04-20 14:01:49.501920478 +0000 UTC m=+1835.031846866"
Apr 20 14:01:50.459472 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:50.459437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gq6hv_e67a4318-4409-4e7d-9c39-57001252f5e2/global-pull-secret-syncer/0.log"
Apr 20 14:01:50.611308 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:50.611280 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xnv4k_c062d4f4-2415-4685-915a-14cbd0991ab3/konnectivity-agent/0.log"
Apr 20 14:01:50.656145 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:50.656110 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-1.ec2.internal_31008c883fac4d80f804d5217a8035e0/haproxy/0.log"
Apr 20 14:01:55.065350 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:55.065272 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-8s6t9_558807db-4efc-4c0f-843b-a01dde2697a8/kuadrant-console-plugin/0.log"
Apr 20 14:01:55.153554 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:55.153524 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-mdvfc_059e2fad-2cbe-4f6a-b0ad-c0cc4209770e/limitador/0.log"
Apr 20 14:01:56.718536 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.718489 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/alertmanager/0.log"
Apr 20 14:01:56.741453 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.741423 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/config-reloader/0.log"
Apr 20 14:01:56.767815 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.767782 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/kube-rbac-proxy-web/0.log"
Apr 20 14:01:56.790528 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.790501 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/kube-rbac-proxy/0.log"
Apr 20 14:01:56.810624 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.810583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/kube-rbac-proxy-metric/0.log"
Apr 20 14:01:56.831134 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.831105 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/prom-label-proxy/0.log"
Apr 20 14:01:56.857730 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.857695 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ce3f9145-cf57-4f49-8343-3564aac75046/init-config-reloader/0.log"
Apr 20 14:01:56.922384 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.922336 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8dhct_3a522c83-7471-4e5d-be7f-5175e61ac4cd/kube-state-metrics/0.log"
Apr 20 14:01:56.947853 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.947824 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8dhct_3a522c83-7471-4e5d-be7f-5175e61ac4cd/kube-rbac-proxy-main/0.log"
Apr 20 14:01:56.973021 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.972911 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8dhct_3a522c83-7471-4e5d-be7f-5175e61ac4cd/kube-rbac-proxy-self/0.log"
Apr 20 14:01:56.998699 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:56.998675 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-58fd84c859-2cb5z_3fddfc18-1911-4f8a-bc01-f13e1fee38da/metrics-server/0.log"
Apr 20 14:01:57.027765 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.027722 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cxm2d_c9b945b1-227f-44ca-b322-5d475bfba434/monitoring-plugin/0.log"
Apr 20 14:01:57.138265 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.138190 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8hjt_0b5f3471-c214-4da6-8284-8a2e35239729/node-exporter/0.log"
Apr 20 14:01:57.163583 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.163557 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8hjt_0b5f3471-c214-4da6-8284-8a2e35239729/kube-rbac-proxy/0.log"
Apr 20 14:01:57.183690 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.183594 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8hjt_0b5f3471-c214-4da6-8284-8a2e35239729/init-textfile/0.log"
Apr 20 14:01:57.283366 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.283337 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p2jvz_21e6288b-35f0-467f-85ec-0224e98f6ecf/kube-rbac-proxy-main/0.log"
Apr 20 14:01:57.305275 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.305242 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p2jvz_21e6288b-35f0-467f-85ec-0224e98f6ecf/kube-rbac-proxy-self/0.log"
Apr 20 14:01:57.329546 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.329515 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p2jvz_21e6288b-35f0-467f-85ec-0224e98f6ecf/openshift-state-metrics/0.log"
Apr 20 14:01:57.363961 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.363925 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/prometheus/0.log"
Apr 20 14:01:57.387551 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.387520 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/config-reloader/0.log"
Apr 20 14:01:57.412693 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.412663 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/thanos-sidecar/0.log"
Apr 20 14:01:57.432869 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.432839 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/kube-rbac-proxy-web/0.log"
Apr 20 14:01:57.454794 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.454770 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/kube-rbac-proxy/0.log"
Apr 20 14:01:57.474057 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.474029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/kube-rbac-proxy-thanos/0.log"
Apr 20 14:01:57.494061 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.494032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_426ca5ef-3a9a-4267-98bc-b9112b05e56f/init-config-reloader/0.log"
Apr 20 14:01:57.525054 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.525022 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-96m7g_fec29c32-4927-4343-91dc-5b24cc32dd2a/prometheus-operator/0.log"
Apr 20 14:01:57.545110 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.545000 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-96m7g_fec29c32-4927-4343-91dc-5b24cc32dd2a/kube-rbac-proxy/0.log"
Apr 20 14:01:57.572329 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.572248 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-7x7xq_77018601-43ed-4d16-b80f-22d590fbb6ea/prometheus-operator-admission-webhook/0.log"
Apr 20 14:01:57.597372 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.597341 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d64487d4b-lmdwc_eece39e6-7f69-43d9-9ef3-312fa1419532/telemeter-client/0.log"
Apr 20 14:01:57.616942 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.616895 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d64487d4b-lmdwc_eece39e6-7f69-43d9-9ef3-312fa1419532/reload/0.log"
Apr 20 14:01:57.638036 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.637986 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d64487d4b-lmdwc_eece39e6-7f69-43d9-9ef3-312fa1419532/kube-rbac-proxy/0.log"
Apr 20 14:01:57.668462 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.668424 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/thanos-query/0.log"
Apr 20 14:01:57.686870 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.686838 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy-web/0.log"
Apr 20 14:01:57.718236 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.718183 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy/0.log"
Apr 20 14:01:57.740578 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.740552 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/prom-label-proxy/0.log"
Apr 20 14:01:57.760145 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.760120 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy-rules/0.log"
Apr 20 14:01:57.780020 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:57.779992 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b6688569-k52vh_60f42529-9f79-4b02-8616-ab3b14916104/kube-rbac-proxy-metrics/0.log"
Apr 20 14:01:58.794930 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.794855 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"]
Apr 20 14:01:58.800102 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.800074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.808366 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.807793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"]
Apr 20 14:01:58.861642 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.861605 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-sys\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.861848 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.861659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqk6t\" (UniqueName: \"kubernetes.io/projected/54b14501-2be7-4aed-adae-5448bc6d71fb-kube-api-access-xqk6t\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.861928 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.861836 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-lib-modules\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.861928 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.861897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-proc\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.862022 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.861932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-podres\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.962923 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.962889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-sys\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963081 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.962946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqk6t\" (UniqueName: \"kubernetes.io/projected/54b14501-2be7-4aed-adae-5448bc6d71fb-kube-api-access-xqk6t\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963081 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-lib-modules\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963081 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963007 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-sys\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963081 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963043 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-proc\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963081 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-podres\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963306 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-proc\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963306 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-podres\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.963306 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.963208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54b14501-2be7-4aed-adae-5448bc6d71fb-lib-modules\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:58.972066 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:58.972036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqk6t\" (UniqueName: \"kubernetes.io/projected/54b14501-2be7-4aed-adae-5448bc6d71fb-kube-api-access-xqk6t\") pod \"perf-node-gather-daemonset-rtrhg\" (UID: \"54b14501-2be7-4aed-adae-5448bc6d71fb\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:59.112684 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:59.112594 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:59.283429 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:59.283185 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"]
Apr 20 14:01:59.286584 ip-10-0-133-1 kubenswrapper[2573]: W0420 14:01:59.286556 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54b14501_2be7_4aed_adae_5448bc6d71fb.slice/crio-1f443e3113ae7bfe5cdb7b2e46510a18bba2aa7f978ed2e74c01af0f19b06be4 WatchSource:0}: Error finding container 1f443e3113ae7bfe5cdb7b2e46510a18bba2aa7f978ed2e74c01af0f19b06be4: Status 404 returned error can't find the container with id 1f443e3113ae7bfe5cdb7b2e46510a18bba2aa7f978ed2e74c01af0f19b06be4
Apr 20 14:01:59.523378 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:59.523343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg" event={"ID":"54b14501-2be7-4aed-adae-5448bc6d71fb","Type":"ContainerStarted","Data":"a2b68f56139a71dbe967f6108d823a0bdfa731101ce809f7fee57a44fcd86f08"}
Apr 20 14:01:59.523378 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:59.523383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg" event={"ID":"54b14501-2be7-4aed-adae-5448bc6d71fb","Type":"ContainerStarted","Data":"1f443e3113ae7bfe5cdb7b2e46510a18bba2aa7f978ed2e74c01af0f19b06be4"}
Apr 20 14:01:59.523600 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:59.523415 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:01:59.542202 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:01:59.542144 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg" podStartSLOduration=1.5421279970000001 podStartE2EDuration="1.542127997s" podCreationTimestamp="2026-04-20 14:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:01:59.539369641 +0000 UTC m=+1845.069296022" watchObservedRunningTime="2026-04-20 14:01:59.542127997 +0000 UTC m=+1845.072054378"
Apr 20 14:02:01.288290 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:01.288232 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwvkk_b848630b-ad2b-4d47-be50-9df586a4911a/dns/0.log"
Apr 20 14:02:01.310934 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:01.310907 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wwvkk_b848630b-ad2b-4d47-be50-9df586a4911a/kube-rbac-proxy/0.log"
Apr 20 14:02:01.357172 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:01.357152 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-75q77_e0388411-4485-4a66-9511-1c06b60790d7/dns-node-resolver/0.log"
Apr 20 14:02:01.890607 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:01.890563 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qvdjn_d282e4a7-f5fa-463a-a056-646bf858c554/node-ca/0.log"
Apr 20 14:02:02.818250 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:02.818223 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-54c65669-bwfbg_1ee2c78f-e61b-4c6a-b5e3-0b238256afb8/kube-auth-proxy/0.log"
Apr 20 14:02:03.400607 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:03.400553 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6cvrv_3909667c-16ab-4114-9adb-f5a8ef49a1fe/serve-healthcheck-canary/0.log"
Apr 20 14:02:03.953568 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:03.953541 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2cjz_64a3417b-0d23-4d64-bb08-acd796c20cc1/kube-rbac-proxy/0.log"
Apr 20 14:02:03.972814 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:03.972792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2cjz_64a3417b-0d23-4d64-bb08-acd796c20cc1/exporter/0.log"
Apr 20 14:02:03.993412 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:03.993372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l2cjz_64a3417b-0d23-4d64-bb08-acd796c20cc1/extractor/0.log"
Apr 20 14:02:05.543680 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:05.543651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-rtrhg"
Apr 20 14:02:06.068620 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:06.068583 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8cd4c57cb-l8559_230f3455-f9a1-4983-ac3e-e9043f1649be/manager/0.log"
Apr 20 14:02:07.324744 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:07.324698 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59c6b8cc85-hll6v_570d9d9f-d844-4a4c-ba11-e19afebd1fd1/manager/0.log"
Apr 20 14:02:13.093015 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.092983 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/kube-multus-additional-cni-plugins/0.log"
Apr 20 14:02:13.115261 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.115239 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/egress-router-binary-copy/0.log"
Apr 20 14:02:13.137134 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.137103 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/cni-plugins/0.log"
Apr 20 14:02:13.160872 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.160846 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/bond-cni-plugin/0.log"
Apr 20 14:02:13.184555 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.184533 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/routeoverride-cni/0.log"
Apr 20 14:02:13.206569 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.206538 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/whereabouts-cni-bincopy/0.log"
Apr 20 14:02:13.227772 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.227730 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bwnwj_650b481f-9321-4709-a40c-b7e7ad6e6429/whereabouts-cni/0.log"
Apr 20 14:02:13.606447 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.606396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s9jnj_d1caa7df-6d09-474b-b1e5-e18a510edd97/kube-multus/0.log"
Apr 20 14:02:13.627680 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.627645 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-55n9j_e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35/network-metrics-daemon/0.log"
Apr 20 14:02:13.646282 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:13.646257 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-55n9j_e5f40fcc-0e0b-4cc2-af17-3085b1f4bf35/kube-rbac-proxy/0.log"
Apr 20 14:02:15.066900 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.066874 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-controller/0.log"
Apr 20 14:02:15.081550 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.081508 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/0.log"
Apr 20 14:02:15.098463 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.098437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovn-acl-logging/1.log"
Apr 20 14:02:15.114766 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.114721 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/kube-rbac-proxy-node/0.log"
Apr 20 14:02:15.134956 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.134930 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 14:02:15.151613 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.151576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/northd/0.log"
Apr 20 14:02:15.169610 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.169591 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/nbdb/0.log"
Apr 20 14:02:15.187948 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.187930 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/sbdb/0.log"
Apr 20 14:02:15.369032 ip-10-0-133-1 kubenswrapper[2573]: I0420 14:02:15.368994 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qtnxb_884c5a2b-9d81-40ae-a58b-9b1298785f9b/ovnkube-controller/0.log"