Mar 18 16:41:20.555087 ip-10-0-135-173 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Mar 18 16:41:20.555099 ip-10-0-135-173 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Mar 18 16:41:20.555108 ip-10-0-135-173 systemd[1]: kubelet.service: Failed with result 'resources'.
Mar 18 16:41:20.555341 ip-10-0-135-173 systemd[1]: Failed to start Kubernetes Kubelet.
Mar 18 16:41:30.680297 ip-10-0-135-173 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Mar 18 16:41:30.680312 ip-10-0-135-173 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6f8bc6b1e8704a5a9616f99aee572aea --
Mar 18 16:43:50.012797 ip-10-0-135-173 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:43:50.453807 ip-10-0-135-173 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:50.453807 ip-10-0-135-173 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:43:50.453807 ip-10-0-135-173 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:50.453807 ip-10-0-135-173 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:43:50.453807 ip-10-0-135-173 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:50.455018 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.454706    2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:43:50.459132 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459116    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:50.459132 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459132    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459136    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459139    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459143    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459146    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459150    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459153    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459156    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459160    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459163    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459166    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459169    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459171    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459174    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459177    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459179    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459182    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459185    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459187    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459190    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:50.459202 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459198    2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459201    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459203    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459207    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459209    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459212    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459215    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459218    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459221    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459223    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459226    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459228    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459231    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459233    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459236    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459239    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459241    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459243    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459246    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459248    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:50.459683 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459251    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459253    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459256    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459259    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459262    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459264    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459267    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459269    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459272    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459277    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459280    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459283    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459285    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459288    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459292    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459295    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459298    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459301    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459303    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459305    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:50.460167 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459308    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459311    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459313    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459316    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459318    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459321    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459324    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459326    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459329    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459332    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459335    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459339    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459341    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459345    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459348    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459350    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459353    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459356    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459358    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:50.460671 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459360    2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459363    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459365    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459368    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459371    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.459374    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460708    2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460715    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460719    2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460722    2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460724    2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460727    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460729    2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460732    2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460734    2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460738    2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460740    2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460743    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460749    2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460752    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:50.461175 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460754    2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460757    2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460759    2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460762    2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460764    2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460767    2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460769    2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460772    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460774    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460776    2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460779    2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460781    2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460784    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460786    2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460789    2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460792    2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460795    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460797    2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460802    2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:50.461680 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460805    2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460808    2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460811    2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460813    2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460816    2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460819    2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460822    2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460824    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460827    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460829    2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460832    2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460834    2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460837    2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460840    2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460842    2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460845    2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460847    2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460850    2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460852    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460855    2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:50.462150 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460857    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460860    2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460862    2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460864    2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460867    2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460870    2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460873    2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460878    2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460880    2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460883    2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460886    2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460888    2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460891    2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460893    2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460896    2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460900    2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460903    2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460906    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460909    2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:50.462648 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460911    2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460914    2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460916    2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460919    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460921    2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460924    2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460934    2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460937    2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460940    2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460943    2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460945    2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460948    2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460950    2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.460952    2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461045    2573 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461053    2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461060    2573 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461064    2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461070    2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461076    2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461081    2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:43:50.463119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461088    2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461091    2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461094    2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461098    2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461102    2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461105    2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461108    2573 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461111    2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461114    2573 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461117    2573 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461119    2573 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461122    2573 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461131    2573 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461134    2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461137    2573 flags.go:64] FLAG: --config-dir=""
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461140    2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461144    2573 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461148    2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461151    2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461154    2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461158    2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461161    2573 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461165    2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461168    2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461171    2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:43:50.463664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461174    2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461178    2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461181    2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461185    2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461188    2573 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461191    2573 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461194    2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461199    2573 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461202    2573 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461205    2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461209    2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461212    2573 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461216    2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461219    2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461222    2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461226    2573 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461228    2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461231    2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461234    2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461237    2573 flags.go:64] FLAG: --experimental-mounter-path=""
Mar
18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461240 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461243 2573 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461246 2573 flags.go:64] FLAG: --feature-gates="" Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461250 2573 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461253 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:43:50.464272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461257 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461260 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461263 2573 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461266 2573 flags.go:64] FLAG: --help="false" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461269 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461272 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461275 2573 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461278 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461281 2573 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461285 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461287 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461290 2573 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461293 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461297 2573 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461300 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461303 2573 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461306 2573 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461309 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461312 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461316 2573 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461319 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461322 2573 flags.go:64] FLAG: --lock-file="" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461324 2573 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461327 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:43:50.465012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461330 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461336 2573 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461339 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461341 2573 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461344 2573 flags.go:64] FLAG: --logging-format="text" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461347 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461350 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461353 2573 flags.go:64] FLAG: --manifest-url="" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461356 2573 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461361 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461364 2573 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461369 2573 flags.go:64] FLAG: --max-pods="110" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461374 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: 
I0318 16:43:50.461377 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461381 2573 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461384 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461399 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461403 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461406 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461414 2573 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461417 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461424 2573 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461427 2573 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:43:50.465634 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461430 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461436 2573 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461439 2573 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461442 2573 flags.go:64] FLAG: --pods-per-core="0" Mar 18 
16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461445 2573 flags.go:64] FLAG: --port="10250" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461449 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461452 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03f9f8d43c0157247" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461455 2573 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461458 2573 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461461 2573 flags.go:64] FLAG: --register-node="true" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461464 2573 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461466 2573 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461470 2573 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461473 2573 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461476 2573 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461479 2573 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461483 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461486 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461489 2573 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 
16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461492 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461495 2573 flags.go:64] FLAG: --runonce="false" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461500 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461503 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461506 2573 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461509 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461512 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:43:50.466228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461514 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461518 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461522 2573 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461525 2573 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461528 2573 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461532 2573 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461535 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:43:50.466863 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:43:50.461538 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461541 2573 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461544 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461549 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461552 2573 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461555 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461560 2573 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461563 2573 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461565 2573 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461568 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461571 2573 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461574 2573 flags.go:64] FLAG: --v="2" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461578 2573 flags.go:64] FLAG: --version="false" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461582 2573 flags.go:64] FLAG: --vmodule="" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461587 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 
16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461590 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461692 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461695 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:43:50.466863 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461698 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461702 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461707 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461710 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461712 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461715 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461718 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461721 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461724 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461726 2573 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAWS Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461729 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461731 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461734 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461737 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461740 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461743 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461745 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461748 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461750 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461753 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:43:50.467506 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461755 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461758 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461760 2573 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461763 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461765 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461768 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461771 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461774 2573 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461776 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461779 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461781 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461784 2573 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461786 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461788 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461792 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:43:50.468028 ip-10-0-135-173 
kubenswrapper[2573]: W0318 16:43:50.461795 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461798 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461800 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461803 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461805 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:43:50.468028 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461808 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461810 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461813 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461815 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461818 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461820 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461823 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461826 2573 feature_gate.go:328] 
unrecognized feature gate: InsightsConfigAPI Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461829 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461831 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461834 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461836 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461839 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461842 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461846 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461850 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461854 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461857 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461860 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:43:50.468538 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461862 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461865 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461867 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461870 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461873 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461875 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461878 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461882 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461884 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:43:50.469013 ip-10-0-135-173 
kubenswrapper[2573]: W0318 16:43:50.461888 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461890 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461893 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461895 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461898 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461900 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461903 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461905 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461909 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461911 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:50.469013 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461914 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461917 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461919 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461922 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461924 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.461927 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.461932 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.468350 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.468367 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468434 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468440 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468443 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468446 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468450 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468452 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468455 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:50.469498 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468458 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468460 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468463 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468466 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468468 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468471 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468474 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468477 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468479 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468482 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468484 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468487 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468489 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468492 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468494 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468497 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468499 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468502 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468504 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468507 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:50.469907 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468509 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468512 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468514 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468517 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468521 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468524 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468527 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468529 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468532 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468534 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468537 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468539 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468542 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468547 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468551 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468554 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468557 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468560 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468562 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:50.470398 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468565 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468568 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468571 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468573 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468576 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468578 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468581 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468583 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468586 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468589 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468591 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468593 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468596 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468599 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468601 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468603 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468607 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468609 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468613 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468616 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:50.470855 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468619 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468622 2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468624 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468627 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468629 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468632 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468634 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468637 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468639 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468642 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468644 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468648 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468653 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468656 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468659 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468662 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468665 2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468667 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468670 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:50.471378 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468673 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.468678 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468797 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468803 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468806 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468809 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468812 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468814 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468817 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468820 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468823 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468826 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468829 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468831 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468834 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:50.471845 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468836 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468839 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468843 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468847 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468850 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468853 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468856 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468860 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468862 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468865 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468868 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468870 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468873 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468875 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468878 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468880 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468883 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468885 2573 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468888 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:50.472220 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468891 2573 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468893 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468896 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468899 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468901 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468904 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468906 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468909 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468911 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468914 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468917 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468920 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468923 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468925 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468928 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468931 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468933 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468936 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468938 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468941 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:50.472698 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468943 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468946 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468948 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468950 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468953 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468955 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468958 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468961 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468963 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468965 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468968 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468970 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468973 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468975 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468978 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468980 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468983 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468985 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468988 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468990 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:50.473189 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468993 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468996 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.468998 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469001 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469004 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469006 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469009 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469012 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469014 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469017 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469019 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469022 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469025 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:50.469027 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.469032 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:50.473690 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.469154 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 18 16:43:50.474052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.471267 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 16:43:50.474052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.472286 2573 server.go:1019] "Starting client certificate rotation"
Mar 18 16:43:50.474052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.472383 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:50.474052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.473222 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:50.499018 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.498997 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:50.503231 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.503198 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:50.518750 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.518729 2573 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:43:50.524327 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.524314 2573 log.go:25] "Validated CRI v1 image API"
Mar 18 16:43:50.525574 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.525545 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:43:50.528598 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.528576 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ae595271-ab25-4a7d-9d83-719ad6f04b4c:/dev/nvme0n1p3 dd5f5c86-259f-490f-9e8b-aedfcf8e3878:/dev/nvme0n1p4]
Mar 18 16:43:50.528669 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.528597 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:43:50.531680 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.531661 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:43:50.534894 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.534782 2573 manager.go:217] Machine: {Timestamp:2026-03-18 16:43:50.532690437 +0000 UTC m=+0.403314606 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2500004 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec270925cf86e1ff96cad9819cab01d0 SystemUUID:ec270925-cf86-e1ff-96ca-d9819cab01d0 BootID:6f8bc6b1-e870-4a5a-9616-f99aee572aea Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:39:e6:45:35:e5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:39:e6:45:35:e5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:44:1e:04:ee:29 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:43:50.534894 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.534889 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:43:50.535016 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.534985 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:43:50.535909 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.535888 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:43:50.536055 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.535913 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-173.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:43:50.536104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.536065 2573 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:43:50.536104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.536074 2573 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:43:50.536104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.536088
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:43:50.536673 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.536663 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 16:43:50.537916 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.537906 2573 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:43:50.538036 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.538028 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 18 16:43:50.542120 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.542110 2573 kubelet.go:491] "Attempting to sync node with API server" Mar 18 16:43:50.542157 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.542126 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 16:43:50.542157 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.542142 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 16:43:50.542157 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.542151 2573 kubelet.go:397] "Adding apiserver pod source" Mar 18 16:43:50.542278 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.542163 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 16:43:50.543285 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.543269 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:43:50.543363 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.543297 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 18 16:43:50.546475 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.546458 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 18 16:43:50.550864 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:43:50.550842 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:43:50.552580 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552557 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:43:50.552580 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552581 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552588 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552593 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552599 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552605 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552611 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552617 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552624 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552629 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552638 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 
16:43:50.552712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.552647 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:43:50.553511 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.553499 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:43:50.553552 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.553513 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:43:50.554188 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.554163 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-173.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 18 16:43:50.554280 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.554263 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 18 16:43:50.555755 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.555739 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9d5mv" Mar 18 16:43:50.557141 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.557127 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:43:50.557207 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.557161 2573 server.go:1295] "Started kubelet" Mar 18 16:43:50.557276 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.557249 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 16:43:50.557326 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.557272 2573 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:43:50.557370 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.557352 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:43:50.558137 ip-10-0-135-173 systemd[1]: Started Kubernetes Kubelet. Mar 18 16:43:50.558423 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.558406 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:43:50.558499 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.558428 2573 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:43:50.561224 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.561204 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-173.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 16:43:50.561994 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.561059 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-173.ec2.internal.189dfd34966480a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-173.ec2.internal,UID:ip-10-0-135-173.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-173.ec2.internal,},FirstTimestamp:2026-03-18 16:43:50.557139107 +0000 UTC m=+0.427763277,LastTimestamp:2026-03-18 16:43:50.557139107 +0000 UTC m=+0.427763277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-173.ec2.internal,}" Mar 18 16:43:50.564307 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.564291 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:43:50.564899 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.564867 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:43:50.565651 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.565626 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:43:50.565727 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.565659 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:43:50.565776 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.565753 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:43:50.565835 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.565803 2573 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:43:50.565835 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.565803 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9d5mv" Mar 18 16:43:50.565835 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.565815 2573 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:43:50.567102 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.566275 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:50.567687 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.567669 2573 factory.go:55] Registering systemd factory Mar 18 16:43:50.567778 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.567730 2573 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:43:50.567996 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.567952 2573 
factory.go:153] Registering CRI-O factory Mar 18 16:43:50.567996 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.567973 2573 factory.go:223] Registration of the crio container factory successfully Mar 18 16:43:50.568106 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.568034 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 16:43:50.568106 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.568061 2573 factory.go:103] Registering Raw factory Mar 18 16:43:50.568106 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.568075 2573 manager.go:1196] Started watching for new ooms in manager Mar 18 16:43:50.568538 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.568520 2573 manager.go:319] Starting recovery of all containers Mar 18 16:43:50.568914 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.568894 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 18 16:43:50.572007 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.571986 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:50.575682 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.575655 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-173.ec2.internal\" not found" node="ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.579940 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.579787 2573 manager.go:324] Recovery completed Mar 18 16:43:50.581120 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.581095 2573 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Mar 18 16:43:50.584064 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.584052 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.586384 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.586367 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.586475 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.586441 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.586475 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.586456 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.586966 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.586950 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 
16:43:50.587033 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.586966 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:43:50.587033 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.586984 2573 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:43:50.588123 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.588113 2573 policy_none.go:49] "None policy: Start" Mar 18 16:43:50.588156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.588127 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:43:50.588156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.588136 2573 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.627833 2573 manager.go:341] "Starting Device Plugin manager" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.627874 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.627888 2573 server.go:85] "Starting device plugin registration server" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.628144 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.628156 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.628243 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.628330 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.628339 2573 plugin_manager.go:118] 
"Starting Kubelet Plugin Manager" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.628918 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:43:50.639686 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.628961 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:50.708262 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.708174 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 18 16:43:50.709318 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.709304 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Mar 18 16:43:50.709366 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.709338 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:43:50.709366 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.709359 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 18 16:43:50.709366 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.709366 2573 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:43:50.709505 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.709412 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:43:50.712361 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.712342 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:50.728683 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.728660 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.734186 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.734171 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.734285 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.734199 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.734285 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.734209 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.734285 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.734233 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.743201 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.743184 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.743257 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.743206 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-173.ec2.internal\": node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 
16:43:50.767757 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.767735 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:50.810558 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.810512 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal"] Mar 18 16:43:50.810704 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.810621 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.812383 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.812367 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.812499 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.812415 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.812499 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.812430 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.813950 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.813935 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.814095 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.814081 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.814143 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.814139 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.815220 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.815205 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.815220 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.815229 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.815365 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.815242 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.815439 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.815404 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.815439 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.815434 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.815531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.815452 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.816631 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.816618 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.816707 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.816641 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.818044 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.818025 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.818115 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.818055 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.818115 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.818071 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.839018 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.838994 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-173.ec2.internal\" not found" node="ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.843660 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.843641 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-173.ec2.internal\" not found" node="ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.867464 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.867434 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd2e7dc4e38c8db4728651f87a3ad871-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal\" (UID: \"bd2e7dc4e38c8db4728651f87a3ad871\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.867464 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:43:50.867465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2e7dc4e38c8db4728651f87a3ad871-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal\" (UID: \"bd2e7dc4e38c8db4728651f87a3ad871\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.867620 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.867485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0b49064290dc49dad3af00f5cc25e86d-config\") pod \"kube-apiserver-proxy-ip-10-0-135-173.ec2.internal\" (UID: \"0b49064290dc49dad3af00f5cc25e86d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.867990 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.867975 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:50.968199 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:50.968111 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:50.968319 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.968214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd2e7dc4e38c8db4728651f87a3ad871-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal\" (UID: \"bd2e7dc4e38c8db4728651f87a3ad871\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.968319 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.968244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bd2e7dc4e38c8db4728651f87a3ad871-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal\" (UID: \"bd2e7dc4e38c8db4728651f87a3ad871\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.968319 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.968262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0b49064290dc49dad3af00f5cc25e86d-config\") pod \"kube-apiserver-proxy-ip-10-0-135-173.ec2.internal\" (UID: \"0b49064290dc49dad3af00f5cc25e86d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.968319 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.968305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/bd2e7dc4e38c8db4728651f87a3ad871-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal\" (UID: \"bd2e7dc4e38c8db4728651f87a3ad871\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.968478 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.968318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0b49064290dc49dad3af00f5cc25e86d-config\") pod \"kube-apiserver-proxy-ip-10-0-135-173.ec2.internal\" (UID: \"0b49064290dc49dad3af00f5cc25e86d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" Mar 18 16:43:50.968478 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:50.968343 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd2e7dc4e38c8db4728651f87a3ad871-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal\" (UID: \"bd2e7dc4e38c8db4728651f87a3ad871\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:51.068853 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.068809 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.142028 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.141993 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:51.145524 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.145506 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" Mar 18 16:43:51.168916 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.168883 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.269367 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.269288 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.369679 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.369650 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.470078 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.470049 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.472215 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.472189 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 16:43:51.472348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.472334 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:43:51.472348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.472339 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:43:51.565114 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.565040 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 18 16:43:51.569070 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.569032 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:38:50 +0000 UTC" deadline="2027-12-31 07:29:07.34747888 +0000 UTC" Mar 18 16:43:51.569070 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.569065 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15662h45m15.778416633s" Mar 18 16:43:51.570617 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.570600 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.577967 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.577951 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:43:51.602560 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.602525 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-mkcpr" Mar 18 16:43:51.608720 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.608692 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mkcpr" Mar 18 16:43:51.632067 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.632047 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:51.670846 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.670817 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.736406 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:51.736358 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b49064290dc49dad3af00f5cc25e86d.slice/crio-6d80d6d2f98ce0f3703c2cc5608c7fb6c6dc1c74fb55232a6451b434f2720580 WatchSource:0}: Error finding container 6d80d6d2f98ce0f3703c2cc5608c7fb6c6dc1c74fb55232a6451b434f2720580: Status 404 returned error can't find the container with id 6d80d6d2f98ce0f3703c2cc5608c7fb6c6dc1c74fb55232a6451b434f2720580 Mar 18 16:43:51.740147 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.740112 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:43:51.759965 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:51.759940 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2e7dc4e38c8db4728651f87a3ad871.slice/crio-9420cd66d328c9c0488aba2a65273a083ce25f79edd047fdaa50cd771e5936ae WatchSource:0}: Error finding container 9420cd66d328c9c0488aba2a65273a083ce25f79edd047fdaa50cd771e5936ae: Status 404 returned error can't find the container with id 9420cd66d328c9c0488aba2a65273a083ce25f79edd047fdaa50cd771e5936ae Mar 18 16:43:51.771788 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:51.771765 2573 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-173.ec2.internal\" not found" Mar 18 16:43:51.853432 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.853385 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:51.866103 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.866080 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" Mar 18 16:43:51.881324 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.881305 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:51.882141 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.882129 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" Mar 18 16:43:51.890471 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:51.890454 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:52.384377 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.384234 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:52.388623 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.388433 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:52.542845 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.542809 2573 apiserver.go:52] "Watching apiserver" Mar 18 16:43:52.548338 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.548311 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 
18 16:43:52.548858 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.548833 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-82l6k","openshift-network-diagnostics/network-check-target-l99p6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal","openshift-multus/network-metrics-daemon-97pm9","openshift-network-operator/iptables-alerter-6dkdp","openshift-ovn-kubernetes/ovnkube-node-5pfhr","kube-system/konnectivity-agent-bl97m","kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal","openshift-cluster-node-tuning-operator/tuned-s92ms","openshift-dns/node-resolver-5nrz9","openshift-image-registry/node-ca-wgm8d","openshift-multus/multus-4khsw"] Mar 18 16:43:52.553013 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.552992 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:43:52.555336 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.555103 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.555977 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.555954 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:43:52.556068 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.556013 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:43:52.556068 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.556029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q6bb2\"" Mar 18 16:43:52.556881 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.556863 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:43:52.557428 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.557384 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.557525 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.557500 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:43:52.558181 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.558162 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.558663 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.558646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bnwwn\"" Mar 18 16:43:52.558940 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.558922 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.559298 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.559278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:43:52.559513 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.559494 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.559988 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.559846 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.560246 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.560130 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9rc74\"" Mar 18 16:43:52.561063 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.561044 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.561292 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.561255 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.561828 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.561694 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.561828 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.561711 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wmhkw\"" Mar 18 16:43:52.564445 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.564018 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:52.564445 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.564084 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:43:52.564445 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.564180 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.566526 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.566507 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6dkdp" Mar 18 16:43:52.566962 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.566942 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:43:52.567452 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.567433 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rfqzx\"" Mar 18 16:43:52.567555 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.567515 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.568223 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.568208 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:43:52.569603 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.569373 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.571745 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.571720 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:43:52.571827 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.571808 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 18 16:43:52.571877 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.571810 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.571877 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.571865 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.571962 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.571952 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:43:52.572194 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.572174 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.572249 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.572196 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.572379 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.572353 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kjv5x\"" Mar 18 16:43:52.575139 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.574974 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.575470 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.575451 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:43:52.575738 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.575719 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:43:52.575822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.575795 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.577254 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.577348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-cni-bin\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.577348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577293 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-etc-kubernetes\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.577348 
ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysctl-conf\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.577348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577321 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:52.577348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7r4v\" (UniqueName: \"kubernetes.io/projected/988c8f30-310e-4643-bf4c-f424b7d7c8ce-kube-api-access-k7r4v\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.577566 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-cni-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.577566 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.577381 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:43:52.577712 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-socket-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.577782 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-modprobe-d\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.577782 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577771 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-run-netns\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.577883 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-cni-netd\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.577883 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovnkube-script-lib\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.577883 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-systemd\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.578022 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-systemd\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578022 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577931 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-var-lib-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578022 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.577980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-node-log\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578022 ip-10-0-135-173 kubenswrapper[2573]: I0318 
16:43:52.578010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-env-overrides\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578209 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce-konnectivity-ca\") pod \"konnectivity-agent-bl97m\" (UID: \"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce\") " pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:43:52.578209 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578080 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-hostroot\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.578209 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-conf-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.578209 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-slash\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 
16:43:52.578209 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578178 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/19928fae-37e8-4123-9e56-7cc4713544ee-multus-daemon-config\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h65pz\" (UniqueName: \"kubernetes.io/projected/19928fae-37e8-4123-9e56-7cc4713544ee-kube-api-access-h65pz\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-device-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blk87\" (UniqueName: \"kubernetes.io/projected/bcfba111-b183-4425-9a51-61c7760fa04b-kube-api-access-blk87\") 
pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-host\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-kubernetes\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-kubelet\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-etc-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-multus-certs\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-ovn\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-lib-modules\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.578529 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysctl-d\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578555 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-tmp\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj22z\" (UniqueName: \"kubernetes.io/projected/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-kube-api-access-vj22z\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgdx8\" (UniqueName: \"kubernetes.io/projected/f9982fef-c82a-4b1f-8622-337551d7ec32-kube-api-access-wgdx8\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/48146439-50bf-4967-ae3b-86c4c7ea0c9d-iptables-alerter-script\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysconfig\") pod \"tuned-s92ms\" (UID: 
\"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-system-cni-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-cnibin\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-os-release\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578775 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19928fae-37e8-4123-9e56-7cc4713544ee-cni-binary-copy\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-netns\") pod 
\"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-sys-fs\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.578900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-var-lib-kubelet\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-cni-multus\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-kubelet\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-sys\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.579216 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm74m\" (UniqueName: \"kubernetes.io/projected/48146439-50bf-4967-ae3b-86c4c7ea0c9d-kube-api-access-zm74m\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579158 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-run\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-systemd-units\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579252 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-cni-bin\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovnkube-config\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579303 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce-agent-certs\") pod \"konnectivity-agent-bl97m\" (UID: \"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce\") " pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-registration-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-log-socket\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579375 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovn-node-metrics-cert\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48146439-50bf-4967-ae3b-86c4c7ea0c9d-host-slash\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-socket-dir-parent\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-k8s-cni-cncf-io\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-tuned\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.580158 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.579550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.580822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.580472 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-84sd2\"" Mar 18 16:43:52.580822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.580616 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.580822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.580638 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:43:52.580822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.580656 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:43:52.580822 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:43:52.580689 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lstmt\"" Mar 18 16:43:52.580822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.580760 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:43:52.580822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.580804 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7z5fg\"" Mar 18 16:43:52.609930 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.609893 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:51 +0000 UTC" deadline="2027-08-29 11:19:52.736576337 +0000 UTC" Mar 18 16:43:52.609930 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.609928 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12690h36m0.126652396s" Mar 18 16:43:52.667100 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.667032 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:43:52.680156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysctl-conf\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.680156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7r4v\" (UniqueName: 
\"kubernetes.io/projected/988c8f30-310e-4643-bf4c-f424b7d7c8ce-kube-api-access-k7r4v\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680185 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-cni-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-socket-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680237 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-host\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qrdf\" (UniqueName: \"kubernetes.io/projected/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-kube-api-access-8qrdf\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-modprobe-d\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-run-netns\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-cni-netd\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680321 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysctl-conf\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovnkube-script-lib\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680367 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-systemd\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.680415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-systemd\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-var-lib-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-socket-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-run-netns\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680488 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-systemd\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-systemd\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680491 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-cni-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-var-lib-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-cni-netd\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680530 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-modprobe-d\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-node-log\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-env-overrides\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680613 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce-konnectivity-ca\") pod \"konnectivity-agent-bl97m\" (UID: \"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce\") " pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680621 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-node-log\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-hostroot\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-conf-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-slash\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.680918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/19928fae-37e8-4123-9e56-7cc4713544ee-multus-daemon-config\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h65pz\" (UniqueName: 
\"kubernetes.io/projected/19928fae-37e8-4123-9e56-7cc4713544ee-kube-api-access-h65pz\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680789 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-device-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blk87\" (UniqueName: \"kubernetes.io/projected/bcfba111-b183-4425-9a51-61c7760fa04b-kube-api-access-blk87\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8f0f214c-5e39-41f2-90e1-683e89ac4db2-hosts-file\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-serviceca\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680894 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-host\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-kubernetes\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.680981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-kubelet\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-etc-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681041 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-multus-certs\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681099 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-cni-binary-copy\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovnkube-script-lib\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:43:52.681758 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-hostroot\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-ovn\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-conf-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-lib-modules\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysctl-d\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-tmp\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj22z\" (UniqueName: \"kubernetes.io/projected/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-kube-api-access-vj22z\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgdx8\" (UniqueName: \"kubernetes.io/projected/f9982fef-c82a-4b1f-8622-337551d7ec32-kube-api-access-wgdx8\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/48146439-50bf-4967-ae3b-86c4c7ea0c9d-iptables-alerter-script\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysconfig\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-system-cni-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-cnibin\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce-konnectivity-ca\") pod \"konnectivity-agent-bl97m\" (UID: \"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce\") " pod="kube-system/konnectivity-agent-bl97m"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-os-release\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19928fae-37e8-4123-9e56-7cc4713544ee-cni-binary-copy\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-netns\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-sys-fs\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681475 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-slash\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.682476 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-cnibin\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-host\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.681955 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-var-lib-kubelet\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.682190 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.182147087 +0000 UTC m=+3.052771256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682209 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-device-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-kubernetes\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/19928fae-37e8-4123-9e56-7cc4713544ee-multus-daemon-config\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-cni-multus\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-kubelet\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-kubelet\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-var-lib-kubelet\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-sys\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-etc-openvswitch\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.683271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-system-cni-dir\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm74m\" (UniqueName: \"kubernetes.io/projected/48146439-50bf-4967-ae3b-86c4c7ea0c9d-kube-api-access-zm74m\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-cni-multus\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-multus-certs\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682659 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-sys\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.681098 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-env-overrides\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-run\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.682928 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683364 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/48146439-50bf-4967-ae3b-86c4c7ea0c9d-iptables-alerter-script\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-run-ovn\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysctl-d\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-os-release\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683653 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-cnibin\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-netns\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-run\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-sys-fs\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-systemd-units\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-lib-modules\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.684021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-cni-bin\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-kubelet\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.683996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovnkube-config\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684037 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce-agent-certs\") pod \"konnectivity-agent-bl97m\" (UID: \"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce\") " pod="kube-system/konnectivity-agent-bl97m"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-registration-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19928fae-37e8-4123-9e56-7cc4713544ee-cni-binary-copy\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld25m\" (UniqueName: \"kubernetes.io/projected/4bd66593-b926-4351-809d-710aff145026-kube-api-access-ld25m\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-log-socket\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684259 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-systemd-units\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovn-node-metrics-cert\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48146439-50bf-4967-ae3b-86c4c7ea0c9d-host-slash\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684437 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-sysconfig\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-socket-dir-parent\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-k8s-cni-cncf-io\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-log-socket\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.684907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8f0f214c-5e39-41f2-90e1-683e89ac4db2-tmp-dir\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-multus-socket-dir-parent\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-cni-bin\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqf77\" (UniqueName: \"kubernetes.io/projected/8f0f214c-5e39-41f2-90e1-683e89ac4db2-kube-api-access-kqf77\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-tuned\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-registration-dir\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.684984 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-system-cni-dir\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685018 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-os-release\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48146439-50bf-4967-ae3b-86c4c7ea0c9d-host-slash\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-run-k8s-cni-cncf-io\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/988c8f30-310e-4643-bf4c-f424b7d7c8ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.685697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovnkube-config\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.685935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-cni-bin\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.686014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-etc-kubernetes\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.686047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcfba111-b183-4425-9a51-61c7760fa04b-etc-selinux\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.686125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-host-var-lib-cni-bin\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.686212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.686178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19928fae-37e8-4123-9e56-7cc4713544ee-etc-kubernetes\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.686949 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.686923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-tmp\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.687349 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.687315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce-agent-certs\") pod \"konnectivity-agent-bl97m\" (UID: \"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce\") " pod="kube-system/konnectivity-agent-bl97m"
Mar 18 16:43:52.687630 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.687611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/988c8f30-310e-4643-bf4c-f424b7d7c8ce-ovn-node-metrics-cert\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.688790 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.688762 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-etc-tuned\") pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms"
Mar 18 16:43:52.689582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.689560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h65pz\" (UniqueName: \"kubernetes.io/projected/19928fae-37e8-4123-9e56-7cc4713544ee-kube-api-access-h65pz\") pod \"multus-4khsw\" (UID: \"19928fae-37e8-4123-9e56-7cc4713544ee\") " pod="openshift-multus/multus-4khsw"
Mar 18 16:43:52.690771 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.690729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7r4v\" (UniqueName: \"kubernetes.io/projected/988c8f30-310e-4643-bf4c-f424b7d7c8ce-kube-api-access-k7r4v\") pod \"ovnkube-node-5pfhr\" (UID: \"988c8f30-310e-4643-bf4c-f424b7d7c8ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr"
Mar 18 16:43:52.692582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.692557 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj22z\" (UniqueName: \"kubernetes.io/projected/b084c4bf-7986-4b9a-8c58-c374ef1fcf78-kube-api-access-vj22z\")
pod \"tuned-s92ms\" (UID: \"b084c4bf-7986-4b9a-8c58-c374ef1fcf78\") " pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.692999 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.692944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm74m\" (UniqueName: \"kubernetes.io/projected/48146439-50bf-4967-ae3b-86c4c7ea0c9d-kube-api-access-zm74m\") pod \"iptables-alerter-6dkdp\" (UID: \"48146439-50bf-4967-ae3b-86c4c7ea0c9d\") " pod="openshift-network-operator/iptables-alerter-6dkdp" Mar 18 16:43:52.693248 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.693231 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgdx8\" (UniqueName: \"kubernetes.io/projected/f9982fef-c82a-4b1f-8622-337551d7ec32-kube-api-access-wgdx8\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:52.693433 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.693412 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blk87\" (UniqueName: \"kubernetes.io/projected/bcfba111-b183-4425-9a51-61c7760fa04b-kube-api-access-blk87\") pod \"aws-ebs-csi-driver-node-5cjnk\" (UID: \"bcfba111-b183-4425-9a51-61c7760fa04b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.713804 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.713745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" event={"ID":"bd2e7dc4e38c8db4728651f87a3ad871","Type":"ContainerStarted","Data":"9420cd66d328c9c0488aba2a65273a083ce25f79edd047fdaa50cd771e5936ae"} Mar 18 16:43:52.714845 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.714813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" 
event={"ID":"0b49064290dc49dad3af00f5cc25e86d","Type":"ContainerStarted","Data":"6d80d6d2f98ce0f3703c2cc5608c7fb6c6dc1c74fb55232a6451b434f2720580"} Mar 18 16:43:52.786979 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.786945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8f0f214c-5e39-41f2-90e1-683e89ac4db2-hosts-file\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.786979 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.786978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-serviceca\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.787190 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787072 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8f0f214c-5e39-41f2-90e1-683e89ac4db2-hosts-file\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.787190 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-cni-binary-copy\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787190 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787325 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787325 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:52.787325 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-cnibin\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787325 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld25m\" (UniqueName: \"kubernetes.io/projected/4bd66593-b926-4351-809d-710aff145026-kube-api-access-ld25m\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 
16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8f0f214c-5e39-41f2-90e1-683e89ac4db2-tmp-dir\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787361 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-tuning-conf-dir\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqf77\" (UniqueName: \"kubernetes.io/projected/8f0f214c-5e39-41f2-90e1-683e89ac4db2-kube-api-access-kqf77\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-serviceca\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-system-cni-dir\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " 
pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-os-release\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-cnibin\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-system-cni-dir\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787531 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-host\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.787925 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-host\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.787925 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qrdf\" (UniqueName: \"kubernetes.io/projected/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-kube-api-access-8qrdf\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.787925 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bd66593-b926-4351-809d-710aff145026-os-release\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787925 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-cni-binary-copy\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787925 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787736 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.787925 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8f0f214c-5e39-41f2-90e1-683e89ac4db2-tmp-dir\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.788090 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.787978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4bd66593-b926-4351-809d-710aff145026-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.793044 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.793024 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:52.793044 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.793045 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:52.793189 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:52.793057 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:52.793189 ip-10-0-135-173 
kubenswrapper[2573]: E0318 16:43:52.793111 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.293098411 +0000 UTC m=+3.163722566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:52.795820 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.795780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld25m\" (UniqueName: \"kubernetes.io/projected/4bd66593-b926-4351-809d-710aff145026-kube-api-access-ld25m\") pod \"multus-additional-cni-plugins-82l6k\" (UID: \"4bd66593-b926-4351-809d-710aff145026\") " pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.795916 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.795792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqf77\" (UniqueName: \"kubernetes.io/projected/8f0f214c-5e39-41f2-90e1-683e89ac4db2-kube-api-access-kqf77\") pod \"node-resolver-5nrz9\" (UID: \"8f0f214c-5e39-41f2-90e1-683e89ac4db2\") " pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.795991 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.795967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qrdf\" (UniqueName: \"kubernetes.io/projected/6e6826d5-ecdc-4d2e-97c7-fbe508364d90-kube-api-access-8qrdf\") pod \"node-ca-wgm8d\" (UID: \"6e6826d5-ecdc-4d2e-97c7-fbe508364d90\") " 
pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:52.865513 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.865468 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:43:52.874500 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.874474 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4khsw" Mar 18 16:43:52.883119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.883092 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" Mar 18 16:43:52.890936 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.890901 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s92ms" Mar 18 16:43:52.897918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.897895 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:43:52.906482 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.906459 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6dkdp" Mar 18 16:43:52.921135 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.921076 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-82l6k" Mar 18 16:43:52.928725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.928702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5nrz9" Mar 18 16:43:52.934251 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:52.934232 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wgm8d" Mar 18 16:43:53.190059 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.189968 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:53.190213 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:53.190128 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:53.190213 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:53.190210 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:54.190190532 +0000 UTC m=+4.060814686 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:53.355915 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.355887 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcfba111_b183_4425_9a51_61c7760fa04b.slice/crio-5e1cfc7a8f692146ad2d8bdc836b2e07db4eef9e807f2ab9cf033c7575b07b8a WatchSource:0}: Error finding container 5e1cfc7a8f692146ad2d8bdc836b2e07db4eef9e807f2ab9cf033c7575b07b8a: Status 404 returned error can't find the container with id 5e1cfc7a8f692146ad2d8bdc836b2e07db4eef9e807f2ab9cf033c7575b07b8a Mar 18 16:43:53.357668 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.357643 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c4dcc2_c3ef_42fa_9b56_1162b7e8fbce.slice/crio-163a9b908731e8517dc3ce9f3a91ff6ec2fa4dde26f042679890fd8d565ab843 WatchSource:0}: Error finding container 163a9b908731e8517dc3ce9f3a91ff6ec2fa4dde26f042679890fd8d565ab843: Status 404 returned error can't find the container with id 163a9b908731e8517dc3ce9f3a91ff6ec2fa4dde26f042679890fd8d565ab843 Mar 18 16:43:53.358260 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.358236 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd66593_b926_4351_809d_710aff145026.slice/crio-29a4823d213191b02d054e58211d227ec410ba14c8fb51273f8da01d1c7d6dda WatchSource:0}: Error finding container 29a4823d213191b02d054e58211d227ec410ba14c8fb51273f8da01d1c7d6dda: Status 404 returned error can't find the container with id 29a4823d213191b02d054e58211d227ec410ba14c8fb51273f8da01d1c7d6dda Mar 18 16:43:53.359677 
ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.359595 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0f214c_5e39_41f2_90e1_683e89ac4db2.slice/crio-2db3759ee1124f94719b30627b70ccb9b3958957628d12c1351a5a825679e07c WatchSource:0}: Error finding container 2db3759ee1124f94719b30627b70ccb9b3958957628d12c1351a5a825679e07c: Status 404 returned error can't find the container with id 2db3759ee1124f94719b30627b70ccb9b3958957628d12c1351a5a825679e07c Mar 18 16:43:53.361802 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.361778 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6826d5_ecdc_4d2e_97c7_fbe508364d90.slice/crio-a51f3e0eb7c673de820effa7ee4295807bb53991692339968e2d995a94cf45a5 WatchSource:0}: Error finding container a51f3e0eb7c673de820effa7ee4295807bb53991692339968e2d995a94cf45a5: Status 404 returned error can't find the container with id a51f3e0eb7c673de820effa7ee4295807bb53991692339968e2d995a94cf45a5 Mar 18 16:43:53.362790 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.362767 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod988c8f30_310e_4643_bf4c_f424b7d7c8ce.slice/crio-39d41aad487f21e2f452612a9d04409fa0fa71fc3b0519275468267c5de17bb4 WatchSource:0}: Error finding container 39d41aad487f21e2f452612a9d04409fa0fa71fc3b0519275468267c5de17bb4: Status 404 returned error can't find the container with id 39d41aad487f21e2f452612a9d04409fa0fa71fc3b0519275468267c5de17bb4 Mar 18 16:43:53.363879 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.363850 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19928fae_37e8_4123_9e56_7cc4713544ee.slice/crio-c9f5a17cac7fcda10f606381c54780cb26cc36dcfe579369a9cf64b76eb7931f WatchSource:0}: 
Error finding container c9f5a17cac7fcda10f606381c54780cb26cc36dcfe579369a9cf64b76eb7931f: Status 404 returned error can't find the container with id c9f5a17cac7fcda10f606381c54780cb26cc36dcfe579369a9cf64b76eb7931f Mar 18 16:43:53.365224 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:43:53.365108 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb084c4bf_7986_4b9a_8c58_c374ef1fcf78.slice/crio-e08f2c1e7296bd869dddfedbf0f10c98d6dc60758309316bfdd6d1a160b0d2d9 WatchSource:0}: Error finding container e08f2c1e7296bd869dddfedbf0f10c98d6dc60758309316bfdd6d1a160b0d2d9: Status 404 returned error can't find the container with id e08f2c1e7296bd869dddfedbf0f10c98d6dc60758309316bfdd6d1a160b0d2d9 Mar 18 16:43:53.391435 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.391411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:53.391604 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:53.391587 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:53.391667 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:53.391611 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:53.391667 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:53.391625 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:53.391757 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:53.391684 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. No retries permitted until 2026-03-18 16:43:54.391664682 +0000 UTC m=+4.262288851 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:53.610787 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.610548 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:51 +0000 UTC" deadline="2027-08-26 18:33:19.014121043 +0000 UTC" Mar 18 16:43:53.610787 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.610783 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12625h49m25.403342145s" Mar 18 16:43:53.717888 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.717848 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"39d41aad487f21e2f452612a9d04409fa0fa71fc3b0519275468267c5de17bb4"} Mar 18 16:43:53.719187 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.719148 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wgm8d" 
event={"ID":"6e6826d5-ecdc-4d2e-97c7-fbe508364d90","Type":"ContainerStarted","Data":"a51f3e0eb7c673de820effa7ee4295807bb53991692339968e2d995a94cf45a5"} Mar 18 16:43:53.720375 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.720348 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bl97m" event={"ID":"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce","Type":"ContainerStarted","Data":"163a9b908731e8517dc3ce9f3a91ff6ec2fa4dde26f042679890fd8d565ab843"} Mar 18 16:43:53.721351 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.721319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" event={"ID":"bcfba111-b183-4425-9a51-61c7760fa04b","Type":"ContainerStarted","Data":"5e1cfc7a8f692146ad2d8bdc836b2e07db4eef9e807f2ab9cf033c7575b07b8a"} Mar 18 16:43:53.723140 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.723115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" event={"ID":"0b49064290dc49dad3af00f5cc25e86d","Type":"ContainerStarted","Data":"5885d65ba70ad3243949762dbc6a1c926a3a18511a30de1cbd3f6f9a40ff7ea6"} Mar 18 16:43:53.724329 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.724306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6dkdp" event={"ID":"48146439-50bf-4967-ae3b-86c4c7ea0c9d","Type":"ContainerStarted","Data":"f515e60ad36006313c326580f7aebd02c675df442acbce3d043c77f2c100b41e"} Mar 18 16:43:53.725379 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.725348 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4khsw" event={"ID":"19928fae-37e8-4123-9e56-7cc4713544ee","Type":"ContainerStarted","Data":"c9f5a17cac7fcda10f606381c54780cb26cc36dcfe579369a9cf64b76eb7931f"} Mar 18 16:43:53.727074 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.726716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-5nrz9" event={"ID":"8f0f214c-5e39-41f2-90e1-683e89ac4db2","Type":"ContainerStarted","Data":"2db3759ee1124f94719b30627b70ccb9b3958957628d12c1351a5a825679e07c"} Mar 18 16:43:53.728179 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.728153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerStarted","Data":"29a4823d213191b02d054e58211d227ec410ba14c8fb51273f8da01d1c7d6dda"} Mar 18 16:43:53.729836 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:53.729809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s92ms" event={"ID":"b084c4bf-7986-4b9a-8c58-c374ef1fcf78","Type":"ContainerStarted","Data":"e08f2c1e7296bd869dddfedbf0f10c98d6dc60758309316bfdd6d1a160b0d2d9"} Mar 18 16:43:54.198718 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.198680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:54.198887 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.198842 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:54.198953 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.198919 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:56.198888813 +0000 UTC m=+6.069512966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:54.400785 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.400743 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:54.400974 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.400938 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:54.400974 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.400960 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:54.400974 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.400973 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:54.401138 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.401034 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. 
No retries permitted until 2026-03-18 16:43:56.401014559 +0000 UTC m=+6.271638719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:54.718056 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.717982 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:54.718519 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.718114 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:43:54.718576 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.718532 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:54.718946 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:54.718619 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:43:54.754866 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.754823 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd2e7dc4e38c8db4728651f87a3ad871" containerID="89d53663ba8f5866432a4b10cd07fcb822244368f71d300ae52cd2e1724b900b" exitCode=0 Mar 18 16:43:54.755815 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.755788 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" event={"ID":"bd2e7dc4e38c8db4728651f87a3ad871","Type":"ContainerDied","Data":"89d53663ba8f5866432a4b10cd07fcb822244368f71d300ae52cd2e1724b900b"} Mar 18 16:43:54.770059 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:54.770002 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-173.ec2.internal" podStartSLOduration=3.769981773 podStartE2EDuration="3.769981773s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:53.73670285 +0000 UTC m=+3.607327038" watchObservedRunningTime="2026-03-18 16:43:54.769981773 +0000 UTC m=+4.640605950" Mar 18 16:43:55.762100 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:55.762053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" event={"ID":"bd2e7dc4e38c8db4728651f87a3ad871","Type":"ContainerStarted","Data":"e10fa6ff576cb2ada415b75b5ebc36bee1d1d5f7a9d0f444482d0e264e75cec4"} Mar 18 16:43:56.217985 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:56.217946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:56.218166 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.218112 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:56.218240 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.218175 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:00.218156844 +0000 UTC m=+10.088781022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:56.419419 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:56.419003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:56.419419 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.419157 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:56.419419 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.419178 2573 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:56.419419 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.419190 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:56.419419 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.419255 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:00.419235098 +0000 UTC m=+10.289859479 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:56.710299 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:56.709710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:56.710299 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.709851 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:43:56.710299 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:56.709720 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:56.710299 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:56.709945 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:43:58.713605 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:58.713570 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:43:58.714058 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:58.713723 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:43:58.714058 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:43:58.713910 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:43:58.714058 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:43:58.713994 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:00.250976 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:00.250936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:00.251469 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.251118 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:00.251469 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.251189 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:08.251167658 +0000 UTC m=+18.121791817 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:00.452252 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:00.452207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:00.452464 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.452366 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:00.452464 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.452386 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:00.452464 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.452415 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:00.452627 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.452478 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:08.452458157 +0000 UTC m=+18.323082312 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:00.713727 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:00.713687 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:00.713945 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:00.713687 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:00.713945 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.713836 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:00.713945 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:00.713894 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:02.710604 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:02.710572 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:02.710604 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:02.710603 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:02.711023 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:02.710710 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:02.711023 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:02.710809 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:04.712770 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:04.712741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:04.713174 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:04.712745 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:04.713174 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:04.712849 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:04.713174 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:04.712896 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:06.709841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:06.709756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:06.709841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:06.709785 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:06.710337 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:06.709903 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:06.710337 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:06.710038 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:08.316270 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:08.316232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:08.316847 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.316400 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:08.316847 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.316470 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:24.316449341 +0000 UTC m=+34.187073511 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:08.517768 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:08.517722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:08.517944 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.517864 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:08.517944 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.517882 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:08.517944 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.517897 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:08.518078 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.517959 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:24.517943349 +0000 UTC m=+34.388567520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:08.710016 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:08.709979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:08.710197 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.710132 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:08.710264 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:08.710194 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:08.710318 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:08.710302 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:10.714167 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:10.712485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:10.714167 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:10.712614 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:10.714167 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:10.712728 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:10.714167 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:10.712814 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:11.802882 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.802633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5nrz9" event={"ID":"8f0f214c-5e39-41f2-90e1-683e89ac4db2","Type":"ContainerStarted","Data":"915a34a80a5f1d970c0b59106ed504b514ef778b877395533e1f578eec5d4c66"} Mar 18 16:44:11.804425 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.804379 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bd66593-b926-4351-809d-710aff145026" containerID="7d219edfbff254a295458d37f4c9d1577efa3fac2955aefa27bb0a26099608f1" exitCode=0 Mar 18 16:44:11.804549 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.804475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerDied","Data":"7d219edfbff254a295458d37f4c9d1577efa3fac2955aefa27bb0a26099608f1"} Mar 18 16:44:11.806411 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.806272 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s92ms" event={"ID":"b084c4bf-7986-4b9a-8c58-c374ef1fcf78","Type":"ContainerStarted","Data":"82508d18eba5900ca957ac5b71182d55531db3c06dbbb739068691f2830c779c"} Mar 18 16:44:11.809515 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.809490 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"d61b87f0372a60c26b12aab1b45257dbec7d22f2e090320d69a20cc2454b8a2a"} Mar 18 16:44:11.809624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.809522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" 
event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"146e37d24793441ec13c16d709fa4233a5a31e0290073f0b8ebc8d261e1b0f6d"} Mar 18 16:44:11.809624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.809537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"1a7d0879a5568df47c2c34cc4e7649ba60a308afe41383e95a5a989208a7ca03"} Mar 18 16:44:11.809624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.809551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"ab76c849c262ebab9834eb38ffe67940f3b532e0f8d54998f9dcf84dcc283ed5"} Mar 18 16:44:11.809624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.809564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"4f39e98a855791b5bb0c938c89d6b943c36a2b26eb3ff07e4d240944642d1cbd"} Mar 18 16:44:11.809624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.809575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"ba3e515509d705ab5782abae80bae1a261ad37c0c348798180c14b60db3f9d53"} Mar 18 16:44:11.811169 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.811143 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wgm8d" event={"ID":"6e6826d5-ecdc-4d2e-97c7-fbe508364d90","Type":"ContainerStarted","Data":"e044b087cec9060230a19dcc37862ad51bbce76b8c5d6e379bfe8379adc5f6f0"} Mar 18 16:44:11.812848 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.812823 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-bl97m" event={"ID":"53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce","Type":"ContainerStarted","Data":"beb4dc3e412552c7276eb51de59fbaeaf9afff75db8d2f782215e43ad95c9e65"} Mar 18 16:44:11.814249 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.814227 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" event={"ID":"bcfba111-b183-4425-9a51-61c7760fa04b","Type":"ContainerStarted","Data":"c780e44686401c7278ade1f9f3ca608c5912d16c48dd7a4bd92394c3da2e36fa"} Mar 18 16:44:11.815515 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.815492 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4khsw" event={"ID":"19928fae-37e8-4123-9e56-7cc4713544ee","Type":"ContainerStarted","Data":"1a1ff314d551cedf8ec6c86d17ec58d3b8bc472dcecc17ed1321c9353ab7f95c"} Mar 18 16:44:11.819785 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.819750 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5nrz9" podStartSLOduration=3.561227249 podStartE2EDuration="20.819740241s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.362065004 +0000 UTC m=+3.232689171" lastFinishedPulling="2026-03-18 16:44:10.620577995 +0000 UTC m=+20.491202163" observedRunningTime="2026-03-18 16:44:11.819598953 +0000 UTC m=+21.690223132" watchObservedRunningTime="2026-03-18 16:44:11.819740241 +0000 UTC m=+21.690364417" Mar 18 16:44:11.820087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.820065 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-173.ec2.internal" podStartSLOduration=20.820059857 podStartE2EDuration="20.820059857s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-18 16:43:55.778093447 +0000 UTC m=+5.648717626" watchObservedRunningTime="2026-03-18 16:44:11.820059857 +0000 UTC m=+21.690684034" Mar 18 16:44:11.857482 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.857438 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wgm8d" podStartSLOduration=3.600723237 podStartE2EDuration="20.85742345s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.363858301 +0000 UTC m=+3.234482460" lastFinishedPulling="2026-03-18 16:44:10.620558516 +0000 UTC m=+20.491182673" observedRunningTime="2026-03-18 16:44:11.856982567 +0000 UTC m=+21.727606742" watchObservedRunningTime="2026-03-18 16:44:11.85742345 +0000 UTC m=+21.728047664" Mar 18 16:44:11.879441 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.879379 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bl97m" podStartSLOduration=9.15037683 podStartE2EDuration="21.879363762s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.35981965 +0000 UTC m=+3.230443827" lastFinishedPulling="2026-03-18 16:44:06.088806601 +0000 UTC m=+15.959430759" observedRunningTime="2026-03-18 16:44:11.87907819 +0000 UTC m=+21.749702367" watchObservedRunningTime="2026-03-18 16:44:11.879363762 +0000 UTC m=+21.749987939" Mar 18 16:44:11.883192 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.883167 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 18 16:44:11.898762 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.898716 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4khsw" podStartSLOduration=4.6125872 podStartE2EDuration="21.898703385s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" 
firstStartedPulling="2026-03-18 16:43:53.368837277 +0000 UTC m=+3.239461437" lastFinishedPulling="2026-03-18 16:44:10.654953463 +0000 UTC m=+20.525577622" observedRunningTime="2026-03-18 16:44:11.898333229 +0000 UTC m=+21.768957405" watchObservedRunningTime="2026-03-18 16:44:11.898703385 +0000 UTC m=+21.769327560" Mar 18 16:44:11.917487 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:11.917278 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s92ms" podStartSLOduration=4.665961183 podStartE2EDuration="21.917030581s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.369251366 +0000 UTC m=+3.239875532" lastFinishedPulling="2026-03-18 16:44:10.620320773 +0000 UTC m=+20.490944930" observedRunningTime="2026-03-18 16:44:11.917040151 +0000 UTC m=+21.787664317" watchObservedRunningTime="2026-03-18 16:44:11.917030581 +0000 UTC m=+21.787654758" Mar 18 16:44:12.192328 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.192299 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:44:12.193051 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.193035 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:44:12.640311 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.640181 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:11.883187079Z","UUID":"b4e619a8-2818-47ea-ac32-54797b3fb8b7","Handler":null,"Name":"","Endpoint":""} Mar 18 16:44:12.643665 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.643639 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 18 
16:44:12.643801 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.643677 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 18 16:44:12.709739 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.709715 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:12.709869 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:12.709850 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:12.710082 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.709715 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:12.710082 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:12.710048 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:12.819691 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.819656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" event={"ID":"bcfba111-b183-4425-9a51-61c7760fa04b","Type":"ContainerStarted","Data":"17a9f465b5a51398b6707902eb68121f8510e114e695b07bb3e82d423aaa6928"} Mar 18 16:44:12.821142 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.821113 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6dkdp" event={"ID":"48146439-50bf-4967-ae3b-86c4c7ea0c9d","Type":"ContainerStarted","Data":"2e59abfcbb6e1cfb8514b7b75868535e3b23899dd72f749d64e2463e75c876c1"} Mar 18 16:44:12.822118 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.822050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:44:12.822650 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.822618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bl97m" Mar 18 16:44:12.835184 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:12.835137 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6dkdp" podStartSLOduration=4.584741155 podStartE2EDuration="21.835119133s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.369944377 +0000 UTC m=+3.240568546" lastFinishedPulling="2026-03-18 16:44:10.620322364 +0000 UTC m=+20.490946524" observedRunningTime="2026-03-18 16:44:12.834736115 +0000 UTC m=+22.705360295" watchObservedRunningTime="2026-03-18 16:44:12.835119133 +0000 UTC m=+22.705743310" Mar 18 16:44:13.826165 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:13.826122 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"97fa4e76bc030c600b32e4c21f2ec3e9442b660ea7fca9ad08362d02f4132e3b"} Mar 18 16:44:13.828246 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:13.828193 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" event={"ID":"bcfba111-b183-4425-9a51-61c7760fa04b","Type":"ContainerStarted","Data":"11f54aa7e2d453065362182a66a3f0cc0dde25d4a918113000c5e6caef21f42f"} Mar 18 16:44:13.845088 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:13.844986 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5cjnk" podStartSLOduration=4.505640613 podStartE2EDuration="23.844969786s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.358428645 +0000 UTC m=+3.229052808" lastFinishedPulling="2026-03-18 16:44:12.697757813 +0000 UTC m=+22.568381981" observedRunningTime="2026-03-18 16:44:13.844023794 +0000 UTC m=+23.714647971" watchObservedRunningTime="2026-03-18 16:44:13.844969786 +0000 UTC m=+23.715593961" Mar 18 16:44:14.710080 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:14.709826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:14.710249 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:14.709826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:14.710249 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:14.710180 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:14.710345 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:14.710279 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:16.709990 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.709952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:16.709990 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.709984 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:16.710737 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:16.710081 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:16.710737 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:16.710224 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:16.836175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.836136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" event={"ID":"988c8f30-310e-4643-bf4c-f424b7d7c8ce","Type":"ContainerStarted","Data":"fce44dda999792461d7c0d1a03d00870b14fa3699fa029ad588837422f147873"} Mar 18 16:44:16.836495 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.836466 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:44:16.837835 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.837811 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bd66593-b926-4351-809d-710aff145026" containerID="a943bb5403e81bcf1bdc447d0eb7765c6455370743dadb96a020baf865561fe0" exitCode=0 Mar 18 16:44:16.837964 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.837844 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerDied","Data":"a943bb5403e81bcf1bdc447d0eb7765c6455370743dadb96a020baf865561fe0"} Mar 18 16:44:16.851888 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.851864 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:44:16.863032 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:16.862990 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" podStartSLOduration=8.229906368 podStartE2EDuration="25.862977326s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.366956846 +0000 UTC m=+3.237581014" lastFinishedPulling="2026-03-18 16:44:11.000027784 +0000 UTC m=+20.870651972" observedRunningTime="2026-03-18 
16:44:16.862530233 +0000 UTC m=+26.733154409" watchObservedRunningTime="2026-03-18 16:44:16.862977326 +0000 UTC m=+26.733601502" Mar 18 16:44:17.841725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:17.841462 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bd66593-b926-4351-809d-710aff145026" containerID="96d12219c0b91e069edcc7046b55a7d9a5bb62ec2cf2bddee6d27ca937dacab8" exitCode=0 Mar 18 16:44:17.842167 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:17.841551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerDied","Data":"96d12219c0b91e069edcc7046b55a7d9a5bb62ec2cf2bddee6d27ca937dacab8"} Mar 18 16:44:17.842918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:17.842336 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:44:17.842918 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:17.842359 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:44:17.856588 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:17.856567 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:44:18.710011 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:18.709971 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:18.710187 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:18.710018 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:18.710187 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:18.710103 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:18.710301 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:18.710210 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:18.845926 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:18.845896 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bd66593-b926-4351-809d-710aff145026" containerID="30877bb2496481857a1f269fe7d896207c34cf87fba7beabf53d9acaa2a8f796" exitCode=0 Mar 18 16:44:18.846303 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:18.845972 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerDied","Data":"30877bb2496481857a1f269fe7d896207c34cf87fba7beabf53d9acaa2a8f796"} Mar 18 16:44:20.711341 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:20.711306 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:20.711842 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:20.711447 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:20.711842 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:20.711533 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:20.711842 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:20.711670 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:22.709829 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:22.709788 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:22.710267 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:22.709803 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:22.710267 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:22.709897 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:22.710267 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:22.710055 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:24.330131 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:24.330095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:24.330665 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.330243 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:24.330665 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.330311 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs podName:f9982fef-c82a-4b1f-8622-337551d7ec32 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:56.330294901 +0000 UTC m=+66.200919056 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs") pod "network-metrics-daemon-97pm9" (UID: "f9982fef-c82a-4b1f-8622-337551d7ec32") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:24.532220 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:24.532180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:24.532416 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.532329 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:24.532416 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.532353 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:24.532416 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.532366 2573 projected.go:194] Error preparing data for projected volume kube-api-access-fklth for pod openshift-network-diagnostics/network-check-target-l99p6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:24.532562 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.532449 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth 
podName:c2cd13a8-578c-4371-b5e8-7ef2af59364b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:56.532431449 +0000 UTC m=+66.403055605 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fklth" (UniqueName: "kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth") pod "network-check-target-l99p6" (UID: "c2cd13a8-578c-4371-b5e8-7ef2af59364b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:24.710664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:24.710472 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:24.710664 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:24.710476 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:24.710664 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.710626 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:24.710933 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:24.710685 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:26.710675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:26.710583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:26.711138 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:26.710583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:26.711138 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:26.710685 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b" Mar 18 16:44:26.711138 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:26.710763 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32" Mar 18 16:44:26.863087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:26.863048 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bd66593-b926-4351-809d-710aff145026" containerID="d40e60a854ec1497a915919eff86cb0d41bcda79cde6e7c7b553e2f595b9057d" exitCode=0 Mar 18 16:44:26.863238 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:26.863115 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerDied","Data":"d40e60a854ec1497a915919eff86cb0d41bcda79cde6e7c7b553e2f595b9057d"} Mar 18 16:44:27.867749 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:27.867715 2573 generic.go:358] "Generic (PLEG): container finished" podID="4bd66593-b926-4351-809d-710aff145026" containerID="4b369ee63ba6e641eb960ab58e54eece9ab718fd7da1402c8cf473658465d81f" exitCode=0 Mar 18 16:44:27.868148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:27.867779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerDied","Data":"4b369ee63ba6e641eb960ab58e54eece9ab718fd7da1402c8cf473658465d81f"} Mar 18 16:44:28.709631 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:28.709594 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:28.709791 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:28.709723 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b"
Mar 18 16:44:28.709791 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:28.709763 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:28.709863 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:28.709849 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32"
Mar 18 16:44:28.872633 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:28.872598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-82l6k" event={"ID":"4bd66593-b926-4351-809d-710aff145026","Type":"ContainerStarted","Data":"9e2f640bfda1566adcda63f0a086343b5425e3d00472866578fc5bbd97d09f22"}
Mar 18 16:44:28.894790 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:28.894740 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-82l6k" podStartSLOduration=5.173608644 podStartE2EDuration="37.894727272s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:43:53.360582289 +0000 UTC m=+3.231206457" lastFinishedPulling="2026-03-18 16:44:26.081700913 +0000 UTC m=+35.952325085" observedRunningTime="2026-03-18 16:44:28.894311776 +0000 UTC m=+38.764935955" watchObservedRunningTime="2026-03-18 16:44:28.894727272 +0000 UTC m=+38.765351447"
Mar 18 16:44:30.710686 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:30.710652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:44:30.711142 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:30.710735 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b"
Mar 18 16:44:30.711142 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:30.710831 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:30.711142 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:30.710968 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32"
Mar 18 16:44:32.710530 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:32.710349 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:44:32.710819 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:32.710418 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:32.710819 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:32.710629 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b"
Mar 18 16:44:32.710819 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:32.710699 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32"
Mar 18 16:44:32.816091 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:32.816061 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l99p6"]
Mar 18 16:44:32.818848 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:32.818822 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-97pm9"]
Mar 18 16:44:32.880201 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:32.880132 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:44:32.880201 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:32.880174 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:32.880421 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:32.880254 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32"
Mar 18 16:44:32.880421 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:32.880316 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b"
Mar 18 16:44:34.710294 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:34.710257 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:44:34.710766 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:34.710298 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:34.710766 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:34.710379 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b"
Mar 18 16:44:34.710766 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:34.710500 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32"
Mar 18 16:44:36.709579 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:36.709545 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:44:36.710175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:36.709546 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:36.710175 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:36.709665 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l99p6" podUID="c2cd13a8-578c-4371-b5e8-7ef2af59364b"
Mar 18 16:44:36.710175 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:36.709715 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-97pm9" podUID="f9982fef-c82a-4b1f-8622-337551d7ec32"
Mar 18 16:44:38.001978 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.001947 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-173.ec2.internal" event="NodeReady"
Mar 18 16:44:38.002458 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.002069 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 18 16:44:38.074112 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.074079 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-94z6k"]
Mar 18 16:44:38.094903 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.094859 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8bzfj"]
Mar 18 16:44:38.095085 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.095044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.106255 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.106225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 18 16:44:38.106386 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.106225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5f2hh\""
Mar 18 16:44:38.106386 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.106225 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.106386 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.106348 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 18 16:44:38.106716 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.106704 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.112680 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.112656 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8bzfj"]
Mar 18 16:44:38.112798 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.112774 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.121190 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.121167 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-94z6k"]
Mar 18 16:44:38.121943 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.121928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 18 16:44:38.122573 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.122555 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7bwt6\""
Mar 18 16:44:38.123646 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.123631 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 18 16:44:38.170797 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.170769 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6wv4w"]
Mar 18 16:44:38.184934 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.184904 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.185110 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.185028 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6wv4w"]
Mar 18 16:44:38.188202 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.188178 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.188344 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.188219 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xkmdp\""
Mar 18 16:44:38.188545 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.188526 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.188749 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.188731 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 18 16:44:38.229568 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfl5\" (UniqueName: \"kubernetes.io/projected/7290ec47-5651-45a6-b07b-cea131daf413-kube-api-access-fmfl5\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.229568 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-crio-socket\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.229799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7290ec47-5651-45a6-b07b-cea131daf413-config-volume\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.229799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7290ec47-5651-45a6-b07b-cea131daf413-tmp-dir\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.229799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfppb\" (UniqueName: \"kubernetes.io/projected/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-kube-api-access-cfppb\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.229799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7290ec47-5651-45a6-b07b-cea131daf413-metrics-tls\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.229799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-data-volume\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.229799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.229975 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.229806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.330799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.330765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7290ec47-5651-45a6-b07b-cea131daf413-metrics-tls\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.330973 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.330807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-data-volume\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.330973 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.330831 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.330973 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.330861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7zs4\" (UniqueName: \"kubernetes.io/projected/459ce05a-5191-4973-af7e-9b892245fcdc-kube-api-access-n7zs4\") pod \"ingress-canary-6wv4w\" (UID: \"459ce05a-5191-4973-af7e-9b892245fcdc\") " pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.330973 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.330894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.331175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.330969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmfl5\" (UniqueName: \"kubernetes.io/projected/7290ec47-5651-45a6-b07b-cea131daf413-kube-api-access-fmfl5\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.331175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-crio-socket\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.331175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459ce05a-5191-4973-af7e-9b892245fcdc-cert\") pod \"ingress-canary-6wv4w\" (UID: \"459ce05a-5191-4973-af7e-9b892245fcdc\") " pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.331175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7290ec47-5651-45a6-b07b-cea131daf413-config-volume\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.331175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7290ec47-5651-45a6-b07b-cea131daf413-tmp-dir\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.331175 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfppb\" (UniqueName: \"kubernetes.io/projected/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-kube-api-access-cfppb\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.331464 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-data-volume\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.331464 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331263 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-crio-socket\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.331552 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.331597 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331578 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7290ec47-5651-45a6-b07b-cea131daf413-tmp-dir\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.331678 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.331664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7290ec47-5651-45a6-b07b-cea131daf413-config-volume\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.335910 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.335885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.336021 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.335885 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7290ec47-5651-45a6-b07b-cea131daf413-metrics-tls\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.339609 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.339587 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfppb\" (UniqueName: \"kubernetes.io/projected/8adbd2f9-bd9e-49c8-b447-5e8d4825c56b-kube-api-access-cfppb\") pod \"insights-runtime-extractor-94z6k\" (UID: \"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b\") " pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.339840 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.339822 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfl5\" (UniqueName: \"kubernetes.io/projected/7290ec47-5651-45a6-b07b-cea131daf413-kube-api-access-fmfl5\") pod \"dns-default-8bzfj\" (UID: \"7290ec47-5651-45a6-b07b-cea131daf413\") " pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.408271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.408233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-94z6k"
Mar 18 16:44:38.421475 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.421445 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:38.432369 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.432337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7zs4\" (UniqueName: \"kubernetes.io/projected/459ce05a-5191-4973-af7e-9b892245fcdc-kube-api-access-n7zs4\") pod \"ingress-canary-6wv4w\" (UID: \"459ce05a-5191-4973-af7e-9b892245fcdc\") " pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.432526 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.432424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459ce05a-5191-4973-af7e-9b892245fcdc-cert\") pod \"ingress-canary-6wv4w\" (UID: \"459ce05a-5191-4973-af7e-9b892245fcdc\") " pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.434787 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.434763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/459ce05a-5191-4973-af7e-9b892245fcdc-cert\") pod \"ingress-canary-6wv4w\" (UID: \"459ce05a-5191-4973-af7e-9b892245fcdc\") " pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.440323 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.440292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7zs4\" (UniqueName: \"kubernetes.io/projected/459ce05a-5191-4973-af7e-9b892245fcdc-kube-api-access-n7zs4\") pod \"ingress-canary-6wv4w\" (UID: \"459ce05a-5191-4973-af7e-9b892245fcdc\") " pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.494841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.494131 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6wv4w"
Mar 18 16:44:38.570472 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.570443 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-94z6k"]
Mar 18 16:44:38.573275 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.573242 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8bzfj"]
Mar 18 16:44:38.577635 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:38.577608 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7290ec47_5651_45a6_b07b_cea131daf413.slice/crio-018ca6cd8b67ddce65eaa24df5caf7daa37678c6da7b04c68333b4615725bae8 WatchSource:0}: Error finding container 018ca6cd8b67ddce65eaa24df5caf7daa37678c6da7b04c68333b4615725bae8: Status 404 returned error can't find the container with id 018ca6cd8b67ddce65eaa24df5caf7daa37678c6da7b04c68333b4615725bae8
Mar 18 16:44:38.619787 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.619762 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6wv4w"]
Mar 18 16:44:38.626407 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:38.626363 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod459ce05a_5191_4973_af7e_9b892245fcdc.slice/crio-4b2f4f168ae8bd81513e24d75138421e21ad021d62120830a54fb116c8420e35 WatchSource:0}: Error finding container 4b2f4f168ae8bd81513e24d75138421e21ad021d62120830a54fb116c8420e35: Status 404 returned error can't find the container with id 4b2f4f168ae8bd81513e24d75138421e21ad021d62120830a54fb116c8420e35
Mar 18 16:44:38.709715 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.709685 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6"
Mar 18 16:44:38.709930 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.709909 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9"
Mar 18 16:44:38.712220 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.712198 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:44:38.712329 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.712313 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:44:38.712445 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.712212 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nk2x6\""
Mar 18 16:44:38.712445 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.712262 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqk66\""
Mar 18 16:44:38.712746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.712268 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:44:38.892219 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.892124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6wv4w" event={"ID":"459ce05a-5191-4973-af7e-9b892245fcdc","Type":"ContainerStarted","Data":"4b2f4f168ae8bd81513e24d75138421e21ad021d62120830a54fb116c8420e35"}
Mar 18 16:44:38.893444 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.893421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-94z6k" event={"ID":"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b","Type":"ContainerStarted","Data":"f6aadb35c54181cb707328aaec6ffaf506fc062369ca7d5a7b6f47b4d13ed451"}
Mar 18 16:44:38.893582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.893451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-94z6k" event={"ID":"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b","Type":"ContainerStarted","Data":"cc6d0ff3d85bedbe80b6f628e47c700c5f58c7ca81c0f5cf9a8d1b6663be5af2"}
Mar 18 16:44:38.894330 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:38.894311 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8bzfj" event={"ID":"7290ec47-5651-45a6-b07b-cea131daf413","Type":"ContainerStarted","Data":"018ca6cd8b67ddce65eaa24df5caf7daa37678c6da7b04c68333b4615725bae8"}
Mar 18 16:44:41.904638 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.904599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6wv4w" event={"ID":"459ce05a-5191-4973-af7e-9b892245fcdc","Type":"ContainerStarted","Data":"83b405ddeaa279d3ca37158a1ed557319c7398a44b0e4b7c377f68adb2e80fc0"}
Mar 18 16:44:41.906041 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.906017 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-94z6k" event={"ID":"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b","Type":"ContainerStarted","Data":"b353666390e21a12a4b8c47abee2111299e44608a5ba1ef76a34e1fe4be540ae"}
Mar 18 16:44:41.907348 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.907326 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8bzfj" event={"ID":"7290ec47-5651-45a6-b07b-cea131daf413","Type":"ContainerStarted","Data":"265ef2c3091d1a3daa2c7d703b7f23d53670eeb681c03d9871802a72f7810312"}
Mar 18 16:44:41.907452 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.907353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8bzfj" event={"ID":"7290ec47-5651-45a6-b07b-cea131daf413","Type":"ContainerStarted","Data":"0ad36f4c96635e55b00b32d56ed772920f0726a941afae51c118cb8d18dec784"}
Mar 18 16:44:41.907539 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.907523 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8bzfj"
Mar 18 16:44:41.972575 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.972506 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6wv4w" podStartSLOduration=1.356945114 podStartE2EDuration="3.97249035s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:38.628094681 +0000 UTC m=+48.498718835" lastFinishedPulling="2026-03-18 16:44:41.243639914 +0000 UTC m=+51.114264071" observedRunningTime="2026-03-18 16:44:41.946587832 +0000 UTC m=+51.817212010" watchObservedRunningTime="2026-03-18 16:44:41.97249035 +0000 UTC m=+51.843114527"
Mar 18 16:44:41.974370 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:41.974315 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8bzfj" podStartSLOduration=1.3145351459999999 podStartE2EDuration="3.974291417s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:38.57962884 +0000 UTC m=+48.450253008" lastFinishedPulling="2026-03-18 16:44:41.239385109 +0000 UTC m=+51.110009279" observedRunningTime="2026-03-18 16:44:41.972319179 +0000 UTC m=+51.842943346" watchObservedRunningTime="2026-03-18 16:44:41.974291417 +0000 UTC m=+51.844915595"
Mar 18 16:44:42.913410 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:42.913355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-94z6k" event={"ID":"8adbd2f9-bd9e-49c8-b447-5e8d4825c56b","Type":"ContainerStarted","Data":"bfe4c04b21dcaff6bf36f7aa4ceda273578ed3a8ed34fe050a3bbe7ca762f52b"}
Mar 18 16:44:42.939034 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:42.938988 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-94z6k" podStartSLOduration=0.850879994 podStartE2EDuration="4.938972827s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:38.731568499 +0000 UTC m=+48.602192653" lastFinishedPulling="2026-03-18 16:44:42.819661327 +0000 UTC m=+52.690285486" observedRunningTime="2026-03-18 16:44:42.937912854 +0000 UTC m=+52.808537065" watchObservedRunningTime="2026-03-18 16:44:42.938972827 +0000 UTC m=+52.809597075"
Mar 18 16:44:43.094591 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.094557 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r"]
Mar 18 16:44:43.097536 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.097519 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r"
Mar 18 16:44:43.100598 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.100575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Mar 18 16:44:43.100719 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.100629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Mar 18 16:44:43.100719 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.100660 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-lvvd4\""
Mar 18 16:44:43.100992 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.100974 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Mar 18 16:44:43.101206 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.101187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 18 16:44:43.101295 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.101189 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 18 16:44:43.101995 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.101976 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj"]
Mar 18 16:44:43.104790 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.104768 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.107660 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.107640 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Mar 18 16:44:43.107838 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.107816 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-kv4tb\"" Mar 18 16:44:43.108208 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.108188 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Mar 18 16:44:43.108514 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.108498 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Mar 18 16:44:43.108709 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.108689 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r"] Mar 18 16:44:43.122446 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.122417 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj"] Mar 18 16:44:43.132070 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.132037 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5jfgb"] Mar 18 16:44:43.135015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.134994 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.137042 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.137021 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:44:43.137147 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.137127 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t8t9n\"" Mar 18 16:44:43.137357 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.137343 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:44:43.137822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.137806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:44:43.168785 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.168751 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htrs\" (UniqueName: \"kubernetes.io/projected/8b4e9753-8c03-4237-855f-f207bb536cc9-kube-api-access-9htrs\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.169006 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.168986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.169130 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.169117 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-tls\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.169225 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.169213 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.169334 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.169321 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.169569 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.169527 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.169685 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.169652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2lqk\" (UniqueName: 
\"kubernetes.io/projected/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-api-access-j2lqk\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.170142 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.169697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.170485 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1c58dd1-8573-4833-a0b4-fc571c8853cd-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.170537 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-textfile\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.170537 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-wtmp\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " 
pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.170605 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170581 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdtz\" (UniqueName: \"kubernetes.io/projected/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-kube-api-access-hrdtz\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.170655 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170640 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b4e9753-8c03-4237-855f-f207bb536cc9-metrics-client-ca\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.170710 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170699 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f1c58dd1-8573-4833-a0b4-fc571c8853cd-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.170746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-sys\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.170778 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.170812 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-root\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.170812 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.170874 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.170861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.271165 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9htrs\" (UniqueName: 
\"kubernetes.io/projected/8b4e9753-8c03-4237-855f-f207bb536cc9-kube-api-access-9htrs\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.271333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.271333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-tls\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.271333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.271333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " 
pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.271333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.271333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271326 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2lqk\" (UniqueName: \"kubernetes.io/projected/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-api-access-j2lqk\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:43.271373 2573 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1c58dd1-8573-4833-a0b4-fc571c8853cd-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: 
\"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-textfile\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:43.271456 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:43.271466 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-tls podName:f1c58dd1-8573-4833-a0b4-fc571c8853cd nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.771444644 +0000 UTC m=+53.642068799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-tls") pod "kube-state-metrics-6df7999c47-8pcdj" (UID: "f1c58dd1-8573-4833-a0b4-fc571c8853cd") : secret "kube-state-metrics-tls" not found Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:43.271557 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-tls podName:0121a6ed-4a8e-408b-a0c5-f8ed07b6656c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.771537938 +0000 UTC m=+53.642162107 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-tls") pod "openshift-state-metrics-68b5d5d464-zhn6r" (UID: "0121a6ed-4a8e-408b-a0c5-f8ed07b6656c") : secret "openshift-state-metrics-tls" not found Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-wtmp\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdtz\" (UniqueName: \"kubernetes.io/projected/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-kube-api-access-hrdtz\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.271652 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b4e9753-8c03-4237-855f-f207bb536cc9-metrics-client-ca\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f1c58dd1-8573-4833-a0b4-fc571c8853cd-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " 
pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-sys\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271753 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-root\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271787 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-wtmp\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-sys\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8b4e9753-8c03-4237-855f-f207bb536cc9-root\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272543 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272199 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1c58dd1-8573-4833-a0b4-fc571c8853cd-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.272543 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f1c58dd1-8573-4833-a0b4-fc571c8853cd-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.272543 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.271844 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272543 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272336 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.272543 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8b4e9753-8c03-4237-855f-f207bb536cc9-metrics-client-ca\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-textfile\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-accelerators-collector-config\") pod \"node-exporter-5jfgb\" (UID: 
\"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.272746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.272705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.275735 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.275709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-tls\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.275841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.275766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8b4e9753-8c03-4237-855f-f207bb536cc9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.275841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.275800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.275926 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.275866 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.282667 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.282646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdtz\" (UniqueName: \"kubernetes.io/projected/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-kube-api-access-hrdtz\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.282796 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.282775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2lqk\" (UniqueName: \"kubernetes.io/projected/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-api-access-j2lqk\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.282986 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.282966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htrs\" (UniqueName: \"kubernetes.io/projected/8b4e9753-8c03-4237-855f-f207bb536cc9-kube-api-access-9htrs\") pod \"node-exporter-5jfgb\" (UID: \"8b4e9753-8c03-4237-855f-f207bb536cc9\") " pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.444879 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.444770 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-5jfgb" Mar 18 16:44:43.453233 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:43.453187 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b4e9753_8c03_4237_855f_f207bb536cc9.slice/crio-2f8729581bc02b92c4de80f95477b441567e1c23e668e00f9375610f1a76b151 WatchSource:0}: Error finding container 2f8729581bc02b92c4de80f95477b441567e1c23e668e00f9375610f1a76b151: Status 404 returned error can't find the container with id 2f8729581bc02b92c4de80f95477b441567e1c23e668e00f9375610f1a76b151 Mar 18 16:44:43.777226 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.777127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.777226 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.777176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:43.779646 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.779612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0121a6ed-4a8e-408b-a0c5-f8ed07b6656c-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-zhn6r\" (UID: \"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" 
Mar 18 16:44:43.779646 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.779612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1c58dd1-8573-4833-a0b4-fc571c8853cd-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-8pcdj\" (UID: \"f1c58dd1-8573-4833-a0b4-fc571c8853cd\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:43.917596 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:43.917548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jfgb" event={"ID":"8b4e9753-8c03-4237-855f-f207bb536cc9","Type":"ContainerStarted","Data":"2f8729581bc02b92c4de80f95477b441567e1c23e668e00f9375610f1a76b151"} Mar 18 16:44:44.007589 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.007549 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" Mar 18 16:44:44.014993 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.014972 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" Mar 18 16:44:44.215603 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.214490 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:44:44.221969 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.221851 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.224672 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.224646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Mar 18 16:44:44.224782 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.224698 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Mar 18 16:44:44.224835 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.224796 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Mar 18 16:44:44.224954 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.224937 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Mar 18 16:44:44.225218 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.224962 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Mar 18 16:44:44.225218 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.224986 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Mar 18 16:44:44.225889 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.225467 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Mar 18 16:44:44.225889 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.225610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Mar 18 16:44:44.225889 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.225689 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Mar 18 16:44:44.225889 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.225788 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bmkrg\"" Mar 18 16:44:44.232339 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.232296 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:44:44.283775 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.283742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.283775 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.283781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-tls-assets\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.283995 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.283800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.283995 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.283864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.283995 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.283900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxg66\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-kube-api-access-hxg66\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.283995 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.283938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284000 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-web-config\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-config-out\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.284333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.284171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-config-volume\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384655 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-web-config\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384655 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384655 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384867 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-config-out\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384867 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384867 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.384867 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.384790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-config-volume\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385044 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385102 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-tls-assets\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385151 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385117 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385151 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385253 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxg66\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-kube-api-access-hxg66\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385253 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385253 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385606 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.385546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.385741 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:44:44.385656 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle podName:218f2a4a-e36b-43db-996a-9a881d2a1117 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:44.885634844 +0000 UTC m=+54.756259018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117") : configmap references non-existent config key: ca-bundle.crt Mar 18 16:44:44.387771 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.387726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-config-out\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.387973 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.387954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.387973 
ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.387967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.388511 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.388487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-tls-assets\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.388612 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.388543 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-config-volume\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.388612 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.388579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.388772 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.388755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-web-config\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.388874 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.388857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.389309 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.389295 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.393300 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.393279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxg66\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-kube-api-access-hxg66\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.461545 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.461513 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj"] Mar 18 16:44:44.464584 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:44.464554 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1c58dd1_8573_4833_a0b4_fc571c8853cd.slice/crio-6040ce4a852905aaf5adc2f4c92c29610c124dc2b9fc882917c1210d1c94d622 WatchSource:0}: Error finding container 6040ce4a852905aaf5adc2f4c92c29610c124dc2b9fc882917c1210d1c94d622: Status 404 returned error can't find the container with id 
6040ce4a852905aaf5adc2f4c92c29610c124dc2b9fc882917c1210d1c94d622 Mar 18 16:44:44.465596 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.465560 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r"] Mar 18 16:44:44.469492 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:44.469464 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0121a6ed_4a8e_408b_a0c5_f8ed07b6656c.slice/crio-de28d2109e8945f4a75da859f6e9c168b623bfc967ddd89014a4a83acaa5424e WatchSource:0}: Error finding container de28d2109e8945f4a75da859f6e9c168b623bfc967ddd89014a4a83acaa5424e: Status 404 returned error can't find the container with id de28d2109e8945f4a75da859f6e9c168b623bfc967ddd89014a4a83acaa5424e Mar 18 16:44:44.703907 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.703824 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b6f5fd49b-mcp9n"] Mar 18 16:44:44.706828 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.706812 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.708880 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.708858 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 18 16:44:44.709367 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.709334 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 18 16:44:44.709492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.709378 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 18 16:44:44.709492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.709383 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 18 16:44:44.709583 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.709538 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 18 16:44:44.709800 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.709780 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:44:44.709893 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.709817 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:44:44.710064 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.710048 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-l5zrn\"" Mar 18 16:44:44.714765 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.714748 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 18 16:44:44.718133 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:44:44.718116 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6f5fd49b-mcp9n"] Mar 18 16:44:44.788911 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.788874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-trusted-ca-bundle\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.788911 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.788910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-oauth-config\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.789120 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.789007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-service-ca\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.789120 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.789037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-config\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.789120 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.789061 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-oauth-serving-cert\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.789120 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.789077 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9kw4\" (UniqueName: \"kubernetes.io/projected/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-kube-api-access-j9kw4\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.789237 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.789157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-serving-cert\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890328 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-trusted-ca-bundle\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890328 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-oauth-config\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " 
pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890604 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-service-ca\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890604 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-config\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890604 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-oauth-serving-cert\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890604 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9kw4\" (UniqueName: \"kubernetes.io/projected/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-kube-api-access-j9kw4\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.890799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.890799 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.890690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-serving-cert\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.891293 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.891264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-trusted-ca-bundle\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.891425 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.891309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-config\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.891425 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.891309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-oauth-serving-cert\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.891550 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.891509 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:44.891642 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.891624 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-service-ca\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.892886 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.892864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-oauth-config\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.893356 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.893337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-serving-cert\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.898492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.898467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9kw4\" (UniqueName: \"kubernetes.io/projected/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-kube-api-access-j9kw4\") pod \"console-b6f5fd49b-mcp9n\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:44.921030 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.920996 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" event={"ID":"f1c58dd1-8573-4833-a0b4-fc571c8853cd","Type":"ContainerStarted","Data":"6040ce4a852905aaf5adc2f4c92c29610c124dc2b9fc882917c1210d1c94d622"} Mar 18 16:44:44.922305 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.922277 2573 generic.go:358] "Generic (PLEG): container finished" podID="8b4e9753-8c03-4237-855f-f207bb536cc9" containerID="8ff8001c051a7945fc9feee1efcd8bf2eafad0dfafc44401c03e28afd21b030c" exitCode=0 Mar 18 16:44:44.922432 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.922355 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jfgb" event={"ID":"8b4e9753-8c03-4237-855f-f207bb536cc9","Type":"ContainerDied","Data":"8ff8001c051a7945fc9feee1efcd8bf2eafad0dfafc44401c03e28afd21b030c"} Mar 18 16:44:44.923937 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.923916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" event={"ID":"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c","Type":"ContainerStarted","Data":"9f08b12e2b4430f3b33f9df0a1d6dd4556d6b6c9cf502711fa6629a3966fce0f"} Mar 18 16:44:44.924023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.923942 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" event={"ID":"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c","Type":"ContainerStarted","Data":"63c8ba5fc90a7fd8d20855243214f356e1528ed2e2bef37d1ed563d783115670"} Mar 18 16:44:44.924023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:44.923952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" event={"ID":"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c","Type":"ContainerStarted","Data":"de28d2109e8945f4a75da859f6e9c168b623bfc967ddd89014a4a83acaa5424e"} Mar 18 16:44:45.016616 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.016592 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:45.134361 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.134322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:44:45.166766 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.166676 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6f5fd49b-mcp9n"] Mar 18 16:44:45.171211 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:45.171169 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeba9ba07_b5db_4044_a877_b7ec62dc1ef7.slice/crio-8171624d57c27ba2e41bbb5b07cb82344f48f13db7dc098639933106935f691b WatchSource:0}: Error finding container 8171624d57c27ba2e41bbb5b07cb82344f48f13db7dc098639933106935f691b: Status 404 returned error can't find the container with id 8171624d57c27ba2e41bbb5b07cb82344f48f13db7dc098639933106935f691b Mar 18 16:44:45.208916 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.208847 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-548bb99f76-vtpmv"] Mar 18 16:44:45.215232 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.214943 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.217767 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.217745 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Mar 18 16:44:45.217941 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.217922 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-va333ahoofva\"" Mar 18 16:44:45.218132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.217767 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-q2zjr\"" Mar 18 16:44:45.218298 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.217884 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Mar 18 16:44:45.218449 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.218137 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Mar 18 16:44:45.218692 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.217842 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Mar 18 16:44:45.218692 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.218188 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Mar 18 16:44:45.228349 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.225943 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-548bb99f76-vtpmv"] Mar 18 16:44:45.291990 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.291957 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:44:45.295497 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295471 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.295722 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295625 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.295722 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-tls\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.295875 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 
18 16:44:45.295875 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.295875 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295832 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b144d91d-2877-4f0b-b92b-f76f99b00d41-metrics-client-ca\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.295875 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6pn\" (UniqueName: \"kubernetes.io/projected/b144d91d-2877-4f0b-b92b-f76f99b00d41-kube-api-access-cc6pn\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.296098 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.295895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-grpc-tls\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.296733 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:45.296706 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218f2a4a_e36b_43db_996a_9a881d2a1117.slice/crio-2d5ba66ac4ad089225a85f46899353881edbe26a66feb2343f0fe34e47f532b4 WatchSource:0}: Error finding container 2d5ba66ac4ad089225a85f46899353881edbe26a66feb2343f0fe34e47f532b4: Status 404 returned error can't find the container with id 2d5ba66ac4ad089225a85f46899353881edbe26a66feb2343f0fe34e47f532b4 Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-tls\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b144d91d-2877-4f0b-b92b-f76f99b00d41-metrics-client-ca\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6pn\" (UniqueName: \"kubernetes.io/projected/b144d91d-2877-4f0b-b92b-f76f99b00d41-kube-api-access-cc6pn\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.396972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-grpc-tls\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.397414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.397007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: 
\"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.398296 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.398268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b144d91d-2877-4f0b-b92b-f76f99b00d41-metrics-client-ca\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.400031 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.399979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.400967 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.400925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.401321 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.401293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-grpc-tls\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.401321 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.401306 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-tls\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.401656 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.401633 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.402930 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.402894 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b144d91d-2877-4f0b-b92b-f76f99b00d41-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.406481 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.406447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6pn\" (UniqueName: \"kubernetes.io/projected/b144d91d-2877-4f0b-b92b-f76f99b00d41-kube-api-access-cc6pn\") pod \"thanos-querier-548bb99f76-vtpmv\" (UID: \"b144d91d-2877-4f0b-b92b-f76f99b00d41\") " pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.529214 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.529126 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:45.928376 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.928341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6f5fd49b-mcp9n" event={"ID":"eba9ba07-b5db-4044-a877-b7ec62dc1ef7","Type":"ContainerStarted","Data":"8171624d57c27ba2e41bbb5b07cb82344f48f13db7dc098639933106935f691b"} Mar 18 16:44:45.929538 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.929509 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"2d5ba66ac4ad089225a85f46899353881edbe26a66feb2343f0fe34e47f532b4"} Mar 18 16:44:45.931490 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.931460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jfgb" event={"ID":"8b4e9753-8c03-4237-855f-f207bb536cc9","Type":"ContainerStarted","Data":"808a503004925e3da35a2f960ba3f11f46a98f0780430c9a35c887b11be667ba"} Mar 18 16:44:45.931602 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.931495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5jfgb" event={"ID":"8b4e9753-8c03-4237-855f-f207bb536cc9","Type":"ContainerStarted","Data":"e92b599e2970f151e43b61712378f6a9af5e01e2eeaa01cd3d9f6dd5a243c97b"} Mar 18 16:44:45.951756 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:45.950475 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5jfgb" podStartSLOduration=2.2483995820000002 podStartE2EDuration="2.95045551s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="2026-03-18 16:44:43.455100258 +0000 UTC m=+53.325724416" lastFinishedPulling="2026-03-18 16:44:44.157156184 +0000 UTC m=+54.027780344" observedRunningTime="2026-03-18 16:44:45.950089446 +0000 UTC m=+55.820713623" 
watchObservedRunningTime="2026-03-18 16:44:45.95045551 +0000 UTC m=+55.821079690" Mar 18 16:44:46.109563 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.109536 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-548bb99f76-vtpmv"] Mar 18 16:44:46.113804 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:46.113753 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb144d91d_2877_4f0b_b92b_f76f99b00d41.slice/crio-7b38d0e330ff1b5bf01ba483dc974210e94b929e69d36765571d8ca1c3a88339 WatchSource:0}: Error finding container 7b38d0e330ff1b5bf01ba483dc974210e94b929e69d36765571d8ca1c3a88339: Status 404 returned error can't find the container with id 7b38d0e330ff1b5bf01ba483dc974210e94b929e69d36765571d8ca1c3a88339 Mar 18 16:44:46.937916 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.937832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" event={"ID":"f1c58dd1-8573-4833-a0b4-fc571c8853cd","Type":"ContainerStarted","Data":"accd82a80365cfb5408c43e81ae3c3cf572c9adf78a5d25c4a01eed7c7ab28c4"} Mar 18 16:44:46.937916 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.937879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" event={"ID":"f1c58dd1-8573-4833-a0b4-fc571c8853cd","Type":"ContainerStarted","Data":"ed7b152b299b10294cfd605942256e0f0277e9f0608267f7cde6644c4e360fac"} Mar 18 16:44:46.937916 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.937894 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" event={"ID":"f1c58dd1-8573-4833-a0b4-fc571c8853cd","Type":"ContainerStarted","Data":"8d62a548c937c7427227e52786fe84385c162ba0a24793342f67c2bbc60b63ad"} Mar 18 16:44:46.939038 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.939011 2573 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"7b38d0e330ff1b5bf01ba483dc974210e94b929e69d36765571d8ca1c3a88339"} Mar 18 16:44:46.941156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.941116 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" event={"ID":"0121a6ed-4a8e-408b-a0c5-f8ed07b6656c","Type":"ContainerStarted","Data":"feca88626c8265f632b7b023d2b01b50956a3b1a9c2d492b7047712d86d5928e"} Mar 18 16:44:46.958209 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.958146 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-6df7999c47-8pcdj" podStartSLOduration=2.46819996 podStartE2EDuration="3.958131413s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.467032219 +0000 UTC m=+54.337656374" lastFinishedPulling="2026-03-18 16:44:45.95696366 +0000 UTC m=+55.827587827" observedRunningTime="2026-03-18 16:44:46.957708908 +0000 UTC m=+56.828333108" watchObservedRunningTime="2026-03-18 16:44:46.958131413 +0000 UTC m=+56.828755586" Mar 18 16:44:46.976267 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:46.976189 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-zhn6r" podStartSLOduration=2.6168512010000002 podStartE2EDuration="3.976171087s" podCreationTimestamp="2026-03-18 16:44:43 +0000 UTC" firstStartedPulling="2026-03-18 16:44:44.593455032 +0000 UTC m=+54.464079187" lastFinishedPulling="2026-03-18 16:44:45.952774903 +0000 UTC m=+55.823399073" observedRunningTime="2026-03-18 16:44:46.97551147 +0000 UTC m=+56.846135647" watchObservedRunningTime="2026-03-18 16:44:46.976171087 +0000 UTC m=+56.846795265" Mar 18 16:44:47.519428 ip-10-0-135-173 kubenswrapper[2573]: I0318 
16:44:47.519373 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5f4d67fd7-9p74k"] Mar 18 16:44:47.538233 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.538201 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5f4d67fd7-9p74k"] Mar 18 16:44:47.538433 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.538343 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.540938 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.540882 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Mar 18 16:44:47.541087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.540886 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-8499n\"" Mar 18 16:44:47.542093 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.541855 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Mar 18 16:44:47.542093 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.541898 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9dpcet2q2ernu\"" Mar 18 16:44:47.542093 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.541925 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Mar 18 16:44:47.542366 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.542197 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Mar 18 16:44:47.623787 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.623691 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-secret-metrics-server-tls\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.623787 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.623741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.624016 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.623878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-client-ca-bundle\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.624016 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.623933 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wr4\" (UniqueName: \"kubernetes.io/projected/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-kube-api-access-82wr4\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.624458 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.624239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-metrics-server-audit-profiles\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.624570 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.624312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-audit-log\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.624630 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.624607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-secret-metrics-server-client-certs\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.725668 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725625 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-secret-metrics-server-tls\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.725863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725682 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: 
\"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.725863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-client-ca-bundle\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.725863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82wr4\" (UniqueName: \"kubernetes.io/projected/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-kube-api-access-82wr4\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.725863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-metrics-server-audit-profiles\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.725863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-audit-log\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.726118 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.725900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-secret-metrics-server-client-certs\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.726583 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.726546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.726889 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.726857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-metrics-server-audit-profiles\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.727049 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.727026 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-audit-log\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.728722 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.728677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-secret-metrics-server-tls\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " 
pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.729290 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.729261 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-secret-metrics-server-client-certs\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.729422 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.729316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-client-ca-bundle\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.734548 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.734525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wr4\" (UniqueName: \"kubernetes.io/projected/f23cb088-2776-4ccb-a0bb-7b7ae3587e23-kube-api-access-82wr4\") pod \"metrics-server-5f4d67fd7-9p74k\" (UID: \"f23cb088-2776-4ccb-a0bb-7b7ae3587e23\") " pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.852188 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.851826 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:44:47.916154 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.916100 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5"] Mar 18 16:44:47.939843 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.939812 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5"] Mar 18 16:44:47.940295 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.939981 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:47.942283 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.942259 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Mar 18 16:44:47.942443 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:47.942262 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5xljg\"" Mar 18 16:44:48.029624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:48.029584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/77b54a08-71b5-4551-bbc4-dda4f8381c0a-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-l2qp5\" (UID: \"77b54a08-71b5-4551-bbc4-dda4f8381c0a\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:48.131232 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:48.131152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/77b54a08-71b5-4551-bbc4-dda4f8381c0a-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-l2qp5\" (UID: \"77b54a08-71b5-4551-bbc4-dda4f8381c0a\") " 
pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:48.133997 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:48.133970 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/77b54a08-71b5-4551-bbc4-dda4f8381c0a-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-l2qp5\" (UID: \"77b54a08-71b5-4551-bbc4-dda4f8381c0a\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:48.250195 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:48.250157 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:49.020850 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.020795 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5"] Mar 18 16:44:49.041756 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.041592 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5f4d67fd7-9p74k"] Mar 18 16:44:49.073928 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:49.073892 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77b54a08_71b5_4551_bbc4_dda4f8381c0a.slice/crio-116f66a21675f057ce06a2bf5665c1582c9f7f3fe048f023328af707151b85ae WatchSource:0}: Error finding container 116f66a21675f057ce06a2bf5665c1582c9f7f3fe048f023328af707151b85ae: Status 404 returned error can't find the container with id 116f66a21675f057ce06a2bf5665c1582c9f7f3fe048f023328af707151b85ae Mar 18 16:44:49.074143 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:49.074119 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23cb088_2776_4ccb_a0bb_7b7ae3587e23.slice/crio-176452eea5106425ccd2267feeab5b1dab3b92bfba9d220f10ace94bb1c5ac38 
WatchSource:0}: Error finding container 176452eea5106425ccd2267feeab5b1dab3b92bfba9d220f10ace94bb1c5ac38: Status 404 returned error can't find the container with id 176452eea5106425ccd2267feeab5b1dab3b92bfba9d220f10ace94bb1c5ac38 Mar 18 16:44:49.374820 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.374742 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:44:49.378378 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.378358 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.380716 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.380691 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 18 16:44:49.380811 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.380691 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 18 16:44:49.380952 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.380937 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 18 16:44:49.381020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.380979 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 18 16:44:49.381071 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381015 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 18 16:44:49.381145 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381130 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rlz7r\"" Mar 18 16:44:49.381404 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381373 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-58j6hgpg9qvhh\"" Mar 18 16:44:49.381929 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381725 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 18 16:44:49.381929 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381780 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 18 16:44:49.381929 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381796 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 18 16:44:49.381929 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381813 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 18 16:44:49.381929 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381830 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 18 16:44:49.381929 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.381916 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 18 16:44:49.383821 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.383804 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 18 16:44:49.394150 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.394125 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:44:49.444808 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.444772 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.444808 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.444813 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.444837 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.444859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.444941 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.444970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77svp\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-kube-api-access-77svp\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445023 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445162 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445162 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445162 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445162 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-config-out\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445277 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445277 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445277 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-config\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445277 ip-10-0-135-173 kubenswrapper[2573]: I0318 
16:44:49.445249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445418 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445418 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.445418 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.445332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-web-config\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546628 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546588 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-config-out\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.546817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-config\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546919 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-web-config\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.546990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.547015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.547042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: 
I0318 16:44:49.547083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.547127 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.547109 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77svp\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-kube-api-access-77svp\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.548146 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.547777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.550415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.550107 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.550415 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.550218 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.550595 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.550456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.551278 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.550920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.551278 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.551004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.551888 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.551862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.552779 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.552477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-config\") pod 
\"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.552779 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.552590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.552779 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.552598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.552779 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.552672 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.553009 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.552882 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-web-config\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.553104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.553079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.553778 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.553735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.554303 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.554262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.555198 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.555151 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.555346 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.555324 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-config-out\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.555946 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.555927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.558001 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.557978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77svp\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-kube-api-access-77svp\") pod \"prometheus-k8s-0\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.689111 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.688553 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:44:49.850371 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.849129 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:44:49.855120 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:49.854958 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7533a106_68bc_47a5_83a7_ff29fc52f046.slice/crio-7450c53d926d42019bf3ff0d5f7bb8907f26e0ec585e7b0d83a846e6d1622a21 WatchSource:0}: Error finding container 7450c53d926d42019bf3ff0d5f7bb8907f26e0ec585e7b0d83a846e6d1622a21: Status 404 returned error can't find the container with id 7450c53d926d42019bf3ff0d5f7bb8907f26e0ec585e7b0d83a846e6d1622a21 Mar 18 16:44:49.866054 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.865698 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfhr" Mar 18 16:44:49.952196 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.952132 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" 
event={"ID":"77b54a08-71b5-4551-bbc4-dda4f8381c0a","Type":"ContainerStarted","Data":"116f66a21675f057ce06a2bf5665c1582c9f7f3fe048f023328af707151b85ae"} Mar 18 16:44:49.955792 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.954636 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864" exitCode=0 Mar 18 16:44:49.955792 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.955053 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"} Mar 18 16:44:49.958937 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.958837 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"7a54265360987f98d0e3bcb28882ffcdb0608a6890878ae2279018da06bc414a"} Mar 18 16:44:49.958937 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.958872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"5049cde28de1292bc3fd0c91a6551b27c3da51942d783cdbaba0b6984d036a7d"} Mar 18 16:44:49.958937 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.958896 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"8e93156e57208a2ec5276ab1e54162393d1c53d7fde1190cfd8baf1aa4412a05"} Mar 18 16:44:49.962222 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.961322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91"} Mar 18 16:44:49.962222 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.961386 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"7450c53d926d42019bf3ff0d5f7bb8907f26e0ec585e7b0d83a846e6d1622a21"} Mar 18 16:44:49.963651 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.963611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6f5fd49b-mcp9n" event={"ID":"eba9ba07-b5db-4044-a877-b7ec62dc1ef7","Type":"ContainerStarted","Data":"3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba"} Mar 18 16:44:49.965803 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:49.965685 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" event={"ID":"f23cb088-2776-4ccb-a0bb-7b7ae3587e23","Type":"ContainerStarted","Data":"176452eea5106425ccd2267feeab5b1dab3b92bfba9d220f10ace94bb1c5ac38"} Mar 18 16:44:50.005435 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:50.005256 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b6f5fd49b-mcp9n" podStartSLOduration=2.300375948 podStartE2EDuration="6.005241523s" podCreationTimestamp="2026-03-18 16:44:44 +0000 UTC" firstStartedPulling="2026-03-18 16:44:45.174000109 +0000 UTC m=+55.044624264" lastFinishedPulling="2026-03-18 16:44:48.878865674 +0000 UTC m=+58.749489839" observedRunningTime="2026-03-18 16:44:50.004722451 +0000 UTC m=+59.875346629" watchObservedRunningTime="2026-03-18 16:44:50.005241523 +0000 UTC m=+59.875865703" Mar 18 16:44:50.124945 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:50.124915 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b6f5fd49b-mcp9n"] Mar 
18 16:44:50.970075 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:50.969974 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91" exitCode=0 Mar 18 16:44:50.970154 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:50.970105 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91"} Mar 18 16:44:51.915704 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.915676 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8bzfj" Mar 18 16:44:51.983572 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.983536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"eb6fa9b350090d6cedf2691a8801385459430174392a6606a6f3859c00fae531"} Mar 18 16:44:51.983769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.983581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"879650115a38db0fcf28bcaf0625c782aad10cbb057189fb8084664761f457f7"} Mar 18 16:44:51.983769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.983594 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" event={"ID":"b144d91d-2877-4f0b-b92b-f76f99b00d41","Type":"ContainerStarted","Data":"c9c656b7ce13c3428cc2d2653911986382ee21167700d56307a26b2222a368dc"} Mar 18 16:44:51.983769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.983702 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:51.985647 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.985617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" event={"ID":"f23cb088-2776-4ccb-a0bb-7b7ae3587e23","Type":"ContainerStarted","Data":"b8c8e10e899a3f989e711f9f0eedb047ad7ee98b08d94898946f6480555c8fa3"} Mar 18 16:44:51.987828 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:51.987734 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" event={"ID":"77b54a08-71b5-4551-bbc4-dda4f8381c0a","Type":"ContainerStarted","Data":"ef7190c796ff4a32ac1b64d2b89d21e98616bd8dc1621629ef14b73dddd8681d"} Mar 18 16:44:52.014319 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:52.013844 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" podStartSLOduration=2.234175254 podStartE2EDuration="7.013828895s" podCreationTimestamp="2026-03-18 16:44:45 +0000 UTC" firstStartedPulling="2026-03-18 16:44:46.116383575 +0000 UTC m=+55.987007734" lastFinishedPulling="2026-03-18 16:44:50.896037204 +0000 UTC m=+60.766661375" observedRunningTime="2026-03-18 16:44:52.011649292 +0000 UTC m=+61.882273493" watchObservedRunningTime="2026-03-18 16:44:52.013828895 +0000 UTC m=+61.884453072" Mar 18 16:44:52.034414 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:52.033968 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" podStartSLOduration=3.213740487 podStartE2EDuration="5.033952357s" podCreationTimestamp="2026-03-18 16:44:47 +0000 UTC" firstStartedPulling="2026-03-18 16:44:49.076450669 +0000 UTC m=+58.947074831" lastFinishedPulling="2026-03-18 16:44:50.896662547 +0000 UTC m=+60.767286701" observedRunningTime="2026-03-18 16:44:52.033475783 +0000 UTC m=+61.904099974" 
watchObservedRunningTime="2026-03-18 16:44:52.033952357 +0000 UTC m=+61.904576533" Mar 18 16:44:52.047729 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:52.047403 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" podStartSLOduration=3.223725923 podStartE2EDuration="5.047369998s" podCreationTimestamp="2026-03-18 16:44:47 +0000 UTC" firstStartedPulling="2026-03-18 16:44:49.07626511 +0000 UTC m=+58.946889265" lastFinishedPulling="2026-03-18 16:44:50.899909186 +0000 UTC m=+60.770533340" observedRunningTime="2026-03-18 16:44:52.046555149 +0000 UTC m=+61.917179326" watchObservedRunningTime="2026-03-18 16:44:52.047369998 +0000 UTC m=+61.917994176" Mar 18 16:44:52.999523 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:52.999479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"} Mar 18 16:44:52.999523 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:52.999527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"} Mar 18 16:44:53.000053 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:52.999542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"} Mar 18 16:44:53.000212 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:53.000162 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:53.000212 
ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:53.000201 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"} Mar 18 16:44:53.000409 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:53.000221 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"} Mar 18 16:44:53.000409 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:53.000235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerStarted","Data":"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"} Mar 18 16:44:53.005797 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:53.005776 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-l2qp5" Mar 18 16:44:53.030976 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:53.030907 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.433332498 podStartE2EDuration="9.030889281s" podCreationTimestamp="2026-03-18 16:44:44 +0000 UTC" firstStartedPulling="2026-03-18 16:44:45.299038089 +0000 UTC m=+55.169662251" lastFinishedPulling="2026-03-18 16:44:51.896594877 +0000 UTC m=+61.767219034" observedRunningTime="2026-03-18 16:44:53.030228218 +0000 UTC m=+62.900852397" watchObservedRunningTime="2026-03-18 16:44:53.030889281 +0000 UTC m=+62.901513435" Mar 18 16:44:54.006058 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:54.006024 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43"} Mar 18 16:44:54.006494 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:54.006071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc"} Mar 18 16:44:54.006494 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:54.006086 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c"} Mar 18 16:44:54.006494 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:54.006099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d"} Mar 18 16:44:55.011927 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:55.011895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e"} Mar 18 16:44:55.011927 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:55.011931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerStarted","Data":"118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e"} Mar 18 16:44:55.017462 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:55.017441 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:44:56.418854 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.418815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:56.421087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.421067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:56.431977 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.431954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f9982fef-c82a-4b1f-8622-337551d7ec32-metrics-certs\") pod \"network-metrics-daemon-97pm9\" (UID: \"f9982fef-c82a-4b1f-8622-337551d7ec32\") " pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:56.621121 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.621085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:56.623525 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.623502 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:56.634062 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.634042 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 
16:44:56.645180 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.645153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fklth\" (UniqueName: \"kubernetes.io/projected/c2cd13a8-578c-4371-b5e8-7ef2af59364b-kube-api-access-fklth\") pod \"network-check-target-l99p6\" (UID: \"c2cd13a8-578c-4371-b5e8-7ef2af59364b\") " pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:56.721931 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.721847 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nk2x6\"" Mar 18 16:44:56.727878 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.727856 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqk66\"" Mar 18 16:44:56.730542 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.730525 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:44:56.736357 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.736332 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-97pm9" Mar 18 16:44:56.858144 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.858083 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.113549534 podStartE2EDuration="7.858061847s" podCreationTimestamp="2026-03-18 16:44:49 +0000 UTC" firstStartedPulling="2026-03-18 16:44:50.971641857 +0000 UTC m=+60.842266025" lastFinishedPulling="2026-03-18 16:44:53.716154174 +0000 UTC m=+63.586778338" observedRunningTime="2026-03-18 16:44:55.060662975 +0000 UTC m=+64.931287152" watchObservedRunningTime="2026-03-18 16:44:56.858061847 +0000 UTC m=+66.728686024" Mar 18 16:44:56.859963 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.859938 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-97pm9"] Mar 18 16:44:56.864344 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:56.864315 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9982fef_c82a_4b1f_8622_337551d7ec32.slice/crio-ae6121e9746fdf7fde5e7eb5203b6b1e6dfc8d030bdb26f3a2f131c33b167992 WatchSource:0}: Error finding container ae6121e9746fdf7fde5e7eb5203b6b1e6dfc8d030bdb26f3a2f131c33b167992: Status 404 returned error can't find the container with id ae6121e9746fdf7fde5e7eb5203b6b1e6dfc8d030bdb26f3a2f131c33b167992 Mar 18 16:44:56.881637 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:56.881610 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l99p6"] Mar 18 16:44:56.886154 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:44:56.886126 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2cd13a8_578c_4371_b5e8_7ef2af59364b.slice/crio-5935ebdc5d8f4c68d539b9998f15ffaeb3e791e0c09c8082b148c746b04a04f6 
WatchSource:0}: Error finding container 5935ebdc5d8f4c68d539b9998f15ffaeb3e791e0c09c8082b148c746b04a04f6: Status 404 returned error can't find the container with id 5935ebdc5d8f4c68d539b9998f15ffaeb3e791e0c09c8082b148c746b04a04f6 Mar 18 16:44:57.018441 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:57.018325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-97pm9" event={"ID":"f9982fef-c82a-4b1f-8622-337551d7ec32","Type":"ContainerStarted","Data":"ae6121e9746fdf7fde5e7eb5203b6b1e6dfc8d030bdb26f3a2f131c33b167992"} Mar 18 16:44:57.019445 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:57.019422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l99p6" event={"ID":"c2cd13a8-578c-4371-b5e8-7ef2af59364b","Type":"ContainerStarted","Data":"5935ebdc5d8f4c68d539b9998f15ffaeb3e791e0c09c8082b148c746b04a04f6"} Mar 18 16:44:58.008325 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:58.008294 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-548bb99f76-vtpmv" Mar 18 16:44:59.028468 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:59.028384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-97pm9" event={"ID":"f9982fef-c82a-4b1f-8622-337551d7ec32","Type":"ContainerStarted","Data":"d65ec65c2c60fee12fa90fce1398bc204c0ab6d26dbf5dfe5d8607bc38d4b004"} Mar 18 16:44:59.028468 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:59.028473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-97pm9" event={"ID":"f9982fef-c82a-4b1f-8622-337551d7ec32","Type":"ContainerStarted","Data":"e7e32daaf054933c485766360b95be09adbc970323fbcab535c7117b7c005397"} Mar 18 16:44:59.047281 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:59.047222 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-97pm9" podStartSLOduration=67.69807264 podStartE2EDuration="1m9.047207449s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:44:56.866340674 +0000 UTC m=+66.736964827" lastFinishedPulling="2026-03-18 16:44:58.215475482 +0000 UTC m=+68.086099636" observedRunningTime="2026-03-18 16:44:59.044694233 +0000 UTC m=+68.915318409" watchObservedRunningTime="2026-03-18 16:44:59.047207449 +0000 UTC m=+68.917831624" Mar 18 16:44:59.689971 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:44:59.689935 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:01.035548 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:01.035519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l99p6" event={"ID":"c2cd13a8-578c-4371-b5e8-7ef2af59364b","Type":"ContainerStarted","Data":"58b2df3636c7c3b57be62fc0e9072afc4e39f8a7f4ca65de4bce7ebb09413946"} Mar 18 16:45:01.035954 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:01.035619 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:45:01.050483 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:01.050420 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-l99p6" podStartSLOduration=66.787118005 podStartE2EDuration="1m10.05038201s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="2026-03-18 16:44:56.888108922 +0000 UTC m=+66.758733076" lastFinishedPulling="2026-03-18 16:45:00.151372913 +0000 UTC m=+70.021997081" observedRunningTime="2026-03-18 16:45:01.049893713 +0000 UTC m=+70.920517888" watchObservedRunningTime="2026-03-18 16:45:01.05038201 +0000 UTC m=+70.921006187" Mar 18 16:45:07.852274 ip-10-0-135-173 kubenswrapper[2573]: I0318 
16:45:07.852237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:45:07.852642 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:07.852286 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:45:17.011638 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.011504 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b6f5fd49b-mcp9n" podUID="eba9ba07-b5db-4044-a877-b7ec62dc1ef7" containerName="console" containerID="cri-o://3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba" gracePeriod=15 Mar 18 16:45:17.262661 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.262600 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6f5fd49b-mcp9n_eba9ba07-b5db-4044-a877-b7ec62dc1ef7/console/0.log" Mar 18 16:45:17.262789 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.262676 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:45:17.297602 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297561 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-trusted-ca-bundle\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.297762 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297727 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-oauth-config\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.297818 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297795 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-service-ca\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.297872 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297842 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-serving-cert\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.297923 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297906 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9kw4\" (UniqueName: \"kubernetes.io/projected/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-kube-api-access-j9kw4\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.297978 
ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297934 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-config\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.298030 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.297978 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:17.298295 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.298262 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-trusted-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:17.298446 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.298321 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-service-ca" (OuterVolumeSpecName: "service-ca") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:17.298446 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.298329 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-config" (OuterVolumeSpecName: "console-config") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:17.300238 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.300215 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:17.300334 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.300253 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:17.300334 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.300317 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-kube-api-access-j9kw4" (OuterVolumeSpecName: "kube-api-access-j9kw4") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "kube-api-access-j9kw4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:45:17.398757 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.398720 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-oauth-serving-cert\") pod \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\" (UID: \"eba9ba07-b5db-4044-a877-b7ec62dc1ef7\") " Mar 18 16:45:17.398961 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.398945 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-serving-cert\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:17.399012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.398968 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j9kw4\" (UniqueName: \"kubernetes.io/projected/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-kube-api-access-j9kw4\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:17.399012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.398983 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-config\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:17.399012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.398997 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-console-oauth-config\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:17.399100 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.399010 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-service-ca\") on node 
\"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:17.399132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.399108 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eba9ba07-b5db-4044-a877-b7ec62dc1ef7" (UID: "eba9ba07-b5db-4044-a877-b7ec62dc1ef7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:17.503989 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:17.503951 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eba9ba07-b5db-4044-a877-b7ec62dc1ef7-oauth-serving-cert\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:18.086680 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.086653 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6f5fd49b-mcp9n_eba9ba07-b5db-4044-a877-b7ec62dc1ef7/console/0.log" Mar 18 16:45:18.087104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.086693 2573 generic.go:358] "Generic (PLEG): container finished" podID="eba9ba07-b5db-4044-a877-b7ec62dc1ef7" containerID="3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba" exitCode=2 Mar 18 16:45:18.087104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.086755 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6f5fd49b-mcp9n" event={"ID":"eba9ba07-b5db-4044-a877-b7ec62dc1ef7","Type":"ContainerDied","Data":"3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba"} Mar 18 16:45:18.087104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.086781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6f5fd49b-mcp9n" 
event={"ID":"eba9ba07-b5db-4044-a877-b7ec62dc1ef7","Type":"ContainerDied","Data":"8171624d57c27ba2e41bbb5b07cb82344f48f13db7dc098639933106935f691b"} Mar 18 16:45:18.087104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.086785 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6f5fd49b-mcp9n" Mar 18 16:45:18.087104 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.086795 2573 scope.go:117] "RemoveContainer" containerID="3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba" Mar 18 16:45:18.096096 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.096079 2573 scope.go:117] "RemoveContainer" containerID="3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba" Mar 18 16:45:18.096368 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:18.096348 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba\": container with ID starting with 3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba not found: ID does not exist" containerID="3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba" Mar 18 16:45:18.096428 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.096377 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba"} err="failed to get container status \"3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba\": rpc error: code = NotFound desc = could not find container \"3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba\": container with ID starting with 3bf83782be9fb7767521e79fe0b5c6612acdc248795af4a704094fa4fef865ba not found: ID does not exist" Mar 18 16:45:18.106249 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.106214 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-b6f5fd49b-mcp9n"] Mar 18 16:45:18.111806 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.111775 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b6f5fd49b-mcp9n"] Mar 18 16:45:18.714217 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:18.714185 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba9ba07-b5db-4044-a877-b7ec62dc1ef7" path="/var/lib/kubelet/pods/eba9ba07-b5db-4044-a877-b7ec62dc1ef7/volumes" Mar 18 16:45:27.857698 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:27.857670 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:45:27.861592 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:27.861568 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5f4d67fd7-9p74k" Mar 18 16:45:32.042483 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:32.042451 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l99p6" Mar 18 16:45:46.716147 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:46.716116 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:46.737761 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:46.737736 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:47.189780 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:47.189753 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:53.457913 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:53.457878 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:45:53.458412 ip-10-0-135-173 kubenswrapper[2573]: 
I0318 16:45:53.458334 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="alertmanager" containerID="cri-o://98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8" gracePeriod=120 Mar 18 16:45:53.458569 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:53.458401 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-metric" containerID="cri-o://ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768" gracePeriod=120 Mar 18 16:45:53.458569 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:53.458470 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy" containerID="cri-o://7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a" gracePeriod=120 Mar 18 16:45:53.458569 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:53.458456 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-web" containerID="cri-o://b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8" gracePeriod=120 Mar 18 16:45:53.458569 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:53.458508 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="config-reloader" containerID="cri-o://79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a" gracePeriod=120 Mar 18 16:45:53.458795 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:53.458617 2573 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="prom-label-proxy" containerID="cri-o://db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc" gracePeriod=120 Mar 18 16:45:54.197132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197101 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc" exitCode=0 Mar 18 16:45:54.197132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197124 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a" exitCode=0 Mar 18 16:45:54.197132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197131 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a" exitCode=0 Mar 18 16:45:54.197132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197136 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8" exitCode=0 Mar 18 16:45:54.197439 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"} Mar 18 16:45:54.197439 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"} Mar 18 16:45:54.197439 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197224 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"} Mar 18 16:45:54.197439 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.197235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"} Mar 18 16:45:54.700753 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.700731 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:45:54.817725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817628 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-cluster-tls-config\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.817725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817668 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-tls-assets\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.817725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817687 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxg66\" (UniqueName: 
\"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-kube-api-access-hxg66\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817735 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-main-tls\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817760 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817793 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-main-db\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817824 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-web\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817847 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-config-out\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817869 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-config-volume\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817896 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-metrics-client-ca\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817935 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.817992 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-web-config\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.818025 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-metric\") pod \"218f2a4a-e36b-43db-996a-9a881d2a1117\" (UID: \"218f2a4a-e36b-43db-996a-9a881d2a1117\") " Mar 18 16:45:54.818492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.818154 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:45:54.818492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.818227 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:54.818492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.818326 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-main-db\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.818492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.818348 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.818737 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.818718 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:54.820927 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.820897 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:45:54.821624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.821490 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-kube-api-access-hxg66" (OuterVolumeSpecName: "kube-api-access-hxg66") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "kube-api-access-hxg66". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:45:54.821624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.821578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.821624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.821613 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-config-volume" (OuterVolumeSpecName: "config-volume") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.821840 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.821813 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.822211 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.822186 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-config-out" (OuterVolumeSpecName: "config-out") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:45:54.822421 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.822374 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.822582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.822566 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.825826 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.825791 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.832260 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.832235 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-web-config" (OuterVolumeSpecName: "web-config") pod "218f2a4a-e36b-43db-996a-9a881d2a1117" (UID: "218f2a4a-e36b-43db-996a-9a881d2a1117"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:54.919669 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919642 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-main-tls\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919669 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919666 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919669 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919677 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/218f2a4a-e36b-43db-996a-9a881d2a1117-config-out\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919686 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-config-volume\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919694 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/218f2a4a-e36b-43db-996a-9a881d2a1117-metrics-client-ca\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919704 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919712 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-web-config\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919721 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919730 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/218f2a4a-e36b-43db-996a-9a881d2a1117-cluster-tls-config\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919740 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-tls-assets\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:54.919891 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:54.919748 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxg66\" (UniqueName: 
\"kubernetes.io/projected/218f2a4a-e36b-43db-996a-9a881d2a1117-kube-api-access-hxg66\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:55.202565 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202530 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768" exitCode=0 Mar 18 16:45:55.202565 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202557 2573 generic.go:358] "Generic (PLEG): container finished" podID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerID="b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8" exitCode=0 Mar 18 16:45:55.202773 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"} Mar 18 16:45:55.202773 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"} Mar 18 16:45:55.202773 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"218f2a4a-e36b-43db-996a-9a881d2a1117","Type":"ContainerDied","Data":"2d5ba66ac4ad089225a85f46899353881edbe26a66feb2343f0fe34e47f532b4"} Mar 18 16:45:55.202773 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202633 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.202773 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.202644 2573 scope.go:117] "RemoveContainer" containerID="db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"
Mar 18 16:45:55.210903 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.210884 2573 scope.go:117] "RemoveContainer" containerID="ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"
Mar 18 16:45:55.222255 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.222235 2573 scope.go:117] "RemoveContainer" containerID="7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"
Mar 18 16:45:55.230597 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.230572 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:45:55.230694 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.230597 2573 scope.go:117] "RemoveContainer" containerID="b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"
Mar 18 16:45:55.233838 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.233812 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:45:55.238255 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.238239 2573 scope.go:117] "RemoveContainer" containerID="79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"
Mar 18 16:45:55.245155 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.245115 2573 scope.go:117] "RemoveContainer" containerID="98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"
Mar 18 16:45:55.252273 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.252248 2573 scope.go:117] "RemoveContainer" containerID="5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"
Mar 18 16:45:55.259571 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.259545 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:45:55.260015 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.259996 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eba9ba07-b5db-4044-a877-b7ec62dc1ef7" containerName="console"
Mar 18 16:45:55.260065 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260020 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba9ba07-b5db-4044-a877-b7ec62dc1ef7" containerName="console"
Mar 18 16:45:55.260065 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260053 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="alertmanager"
Mar 18 16:45:55.260065 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260062 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="alertmanager"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260078 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="prom-label-proxy"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260086 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="prom-label-proxy"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260099 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="config-reloader"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260107 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="config-reloader"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260116 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-web"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260124 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-web"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260135 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-metric"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260143 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-metric"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260158 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="init-config-reloader"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260166 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="init-config-reloader"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260175 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy"
Mar 18 16:45:55.260182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260184 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260260 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="alertmanager"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260275 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="config-reloader"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260277 2573 scope.go:117] "RemoveContainer" containerID="db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260284 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-metric"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260293 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="prom-label-proxy"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260302 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy-web"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260312 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="eba9ba07-b5db-4044-a877-b7ec62dc1ef7" containerName="console"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260319 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" containerName="kube-rbac-proxy"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.260578 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc\": container with ID starting with db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc not found: ID does not exist" containerID="db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260602 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"} err="failed to get container status \"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc\": rpc error: code = NotFound desc = could not find container \"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc\": container with ID starting with db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc not found: ID does not exist"
Mar 18 16:45:55.260676 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260626 2573 scope.go:117] "RemoveContainer" containerID="ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"
Mar 18 16:45:55.261204 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.260837 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768\": container with ID starting with ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768 not found: ID does not exist" containerID="ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"
Mar 18 16:45:55.261204 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260852 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"} err="failed to get container status \"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768\": rpc error: code = NotFound desc = could not find container \"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768\": container with ID starting with ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768 not found: ID does not exist"
Mar 18 16:45:55.261204 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.260868 2573 scope.go:117] "RemoveContainer" containerID="7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"
Mar 18 16:45:55.261204 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.261079 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a\": container with ID starting with 7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a not found: ID does not exist" containerID="7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"
Mar 18 16:45:55.261204 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261106 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"} err="failed to get container status \"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a\": rpc error: code = NotFound desc = could not find container \"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a\": container with ID starting with 7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a not found: ID does not exist"
Mar 18 16:45:55.261204 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261131 2573 scope.go:117] "RemoveContainer" containerID="b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"
Mar 18 16:45:55.261507 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.261441 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8\": container with ID starting with b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8 not found: ID does not exist" containerID="b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"
Mar 18 16:45:55.261507 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261467 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"} err="failed to get container status \"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8\": rpc error: code = NotFound desc = could not find container \"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8\": container with ID starting with b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8 not found: ID does not exist"
Mar 18 16:45:55.261507 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261486 2573 scope.go:117] "RemoveContainer" containerID="79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"
Mar 18 16:45:55.261740 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.261725 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a\": container with ID starting with 79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a not found: ID does not exist" containerID="79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"
Mar 18 16:45:55.261784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261745 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"} err="failed to get container status \"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a\": rpc error: code = NotFound desc = could not find container \"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a\": container with ID starting with 79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a not found: ID does not exist"
Mar 18 16:45:55.261784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261760 2573 scope.go:117] "RemoveContainer" containerID="98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"
Mar 18 16:45:55.261982 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.261963 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8\": container with ID starting with 98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8 not found: ID does not exist" containerID="98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"
Mar 18 16:45:55.262018 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.261988 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"} err="failed to get container status \"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8\": rpc error: code = NotFound desc = could not find container \"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8\": container with ID starting with 98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8 not found: ID does not exist"
Mar 18 16:45:55.262018 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262002 2573 scope.go:117] "RemoveContainer" containerID="5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"
Mar 18 16:45:55.262219 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:55.262206 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864\": container with ID starting with 5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864 not found: ID does not exist" containerID="5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"
Mar 18 16:45:55.262265 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262221 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"} err="failed to get container status \"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864\": rpc error: code = NotFound desc = could not find container \"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864\": container with ID starting with 5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864 not found: ID does not exist"
Mar 18 16:45:55.262265 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262232 2573 scope.go:117] "RemoveContainer" containerID="db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"
Mar 18 16:45:55.262444 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262428 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc"} err="failed to get container status \"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc\": rpc error: code = NotFound desc = could not find container \"db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc\": container with ID starting with db10b04c9fc8392e8d9f8325de032e8b094dffd308ee129973c67c4854eecccc not found: ID does not exist"
Mar 18 16:45:55.262493 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262444 2573 scope.go:117] "RemoveContainer" containerID="ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"
Mar 18 16:45:55.262636 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262616 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768"} err="failed to get container status \"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768\": rpc error: code = NotFound desc = could not find container \"ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768\": container with ID starting with ddaf3a90d086ea5afc2419b7595e7e17e667eeb38517a3a00a2676b3923ef768 not found: ID does not exist"
Mar 18 16:45:55.262636 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262636 2573 scope.go:117] "RemoveContainer" containerID="7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"
Mar 18 16:45:55.262839 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262821 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a"} err="failed to get container status \"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a\": rpc error: code = NotFound desc = could not find container \"7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a\": container with ID starting with 7719ac8ac39784f670f4b1a2e427d2c6659b8994fc77f7efd4be6a696f4e713a not found: ID does not exist"
Mar 18 16:45:55.262889 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262840 2573 scope.go:117] "RemoveContainer" containerID="b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"
Mar 18 16:45:55.263014 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.262998 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8"} err="failed to get container status \"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8\": rpc error: code = NotFound desc = could not find container \"b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8\": container with ID starting with b9891777c5d79fb950a3cd9472cdaebbc60ba21a1b4c20242718dc3be67b5af8 not found: ID does not exist"
Mar 18 16:45:55.263049 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.263016 2573 scope.go:117] "RemoveContainer" containerID="79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"
Mar 18 16:45:55.263231 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.263214 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a"} err="failed to get container status \"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a\": rpc error: code = NotFound desc = could not find container \"79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a\": container with ID starting with 79ffd409c970a79e5debaa74ff5cbcba5c0ad3c19a00513cd7787c5c2ec2916a not found: ID does not exist"
Mar 18 16:45:55.263271 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.263232 2573 scope.go:117] "RemoveContainer" containerID="98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"
Mar 18 16:45:55.263459 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.263442 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8"} err="failed to get container status \"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8\": rpc error: code = NotFound desc = could not find container \"98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8\": container with ID starting with 98c8b704aa34c2299214fe2e45460a7c64d17cef8f783058836ea6368957b2b8 not found: ID does not exist"
Mar 18 16:45:55.263503 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.263460 2573 scope.go:117] "RemoveContainer" containerID="5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"
Mar 18 16:45:55.263648 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.263633 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864"} err="failed to get container status \"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864\": rpc error: code = NotFound desc = could not find container \"5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864\": container with ID starting with 5268aec30236bd570e30e0c93bf8951be53e994cb6aeb688eb04fe6775082864 not found: ID does not exist"
Mar 18 16:45:55.265367 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.265353 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.267431 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267411 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Mar 18 16:45:55.267579 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267416 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Mar 18 16:45:55.267579 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-bmkrg\""
Mar 18 16:45:55.267769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267625 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Mar 18 16:45:55.267769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267633 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Mar 18 16:45:55.267769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Mar 18 16:45:55.267769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267417 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Mar 18 16:45:55.267769 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267467 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Mar 18 16:45:55.267999 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.267921 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Mar 18 16:45:55.272838 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.272821 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Mar 18 16:45:55.275492 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.275467 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 18 16:45:55.424693 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.424693 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424694 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.424953 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.424953 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424802 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.424953 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.424953 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.424953 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-config-out\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.425156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.425156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.424991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-web-config\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.425156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.425015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.425156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.425046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.425156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.425109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-config-volume\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.425156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.425148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-kube-api-access-rsdc8\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.525820 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.525735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.525820 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.525775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.525820 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.525801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-config-out\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526060 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.525830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526060 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.525943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-web-config\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526060 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.525972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526060 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526060 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526035 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-config-volume\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526060 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526060 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-kube-api-access-rsdc8\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526354 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526354 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526354 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526354 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526354 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.526670 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.526646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.527304 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.527278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529027 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.528966 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529027 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.528977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-config-out\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529200 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.529109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529440 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.529421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-web-config\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529537 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.529489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529647 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.529630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529647 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.529637 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.529822 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.529806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.530638 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.530618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-config-volume\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.535148 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.535130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsdc8\" (UniqueName: \"kubernetes.io/projected/742e1fd3-1ae2-4fe1-90f1-31ac5075418f-kube-api-access-rsdc8\") pod \"alertmanager-main-0\" (UID: \"742e1fd3-1ae2-4fe1-90f1-31ac5075418f\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 18 16:45:55.575268 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.575236 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 18 16:45:55.700747 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:55.700723 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 18 16:45:55.703725 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:45:55.703689 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod742e1fd3_1ae2_4fe1_90f1_31ac5075418f.slice/crio-cdeec0d4049eb889c1d3cf15c340fe1973c928154ed475f1337b9fb4e9f69058 WatchSource:0}: Error finding container cdeec0d4049eb889c1d3cf15c340fe1973c928154ed475f1337b9fb4e9f69058: Status 404 returned error can't find the container with id cdeec0d4049eb889c1d3cf15c340fe1973c928154ed475f1337b9fb4e9f69058 Mar 18 16:45:56.206782 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:56.206745 2573 generic.go:358] "Generic (PLEG): container finished" podID="742e1fd3-1ae2-4fe1-90f1-31ac5075418f" containerID="8056868e36d5fe52f1023218340ffb1c766f19e1f566bd24e42cb903bfe42794" exitCode=0 Mar 18 16:45:56.206944 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:56.206837 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerDied","Data":"8056868e36d5fe52f1023218340ffb1c766f19e1f566bd24e42cb903bfe42794"} Mar 18 16:45:56.206944 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:56.206867 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"cdeec0d4049eb889c1d3cf15c340fe1973c928154ed475f1337b9fb4e9f69058"} Mar 18 16:45:56.718461 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:56.715375 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218f2a4a-e36b-43db-996a-9a881d2a1117" 
path="/var/lib/kubelet/pods/218f2a4a-e36b-43db-996a-9a881d2a1117/volumes" Mar 18 16:45:57.213361 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.213327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"f66f14bb444a5c9acd00dc988a1e3739d023154d54be0167280de62557f560cd"} Mar 18 16:45:57.213361 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.213363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"a32639b2e205215757b643083eb7987a4a4d98e3feab2af4b3c3a2ef907275e9"} Mar 18 16:45:57.213582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.213373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"ff6b76d74cc3200a68b5da39acbb4c8a9fc437c1430dd22c924517f8f5732b43"} Mar 18 16:45:57.213582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.213382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"a29beb2d2249b6da3f3eda73bff578aec19a61166c23274006124d424b44fd2d"} Mar 18 16:45:57.213582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.213404 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"9db74f016934f98e32081c38f53297c026a10d3aaf85c99ae7f5f68e171e95ef"} Mar 18 16:45:57.213582 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.213413 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"742e1fd3-1ae2-4fe1-90f1-31ac5075418f","Type":"ContainerStarted","Data":"bf5a2acf4ff3b6c2ada4189e36e25ad945e4f715bb9f5223f8821c5ef35cac50"} Mar 18 16:45:57.240808 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.240743 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.24072794 podStartE2EDuration="2.24072794s" podCreationTimestamp="2026-03-18 16:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:45:57.239566462 +0000 UTC m=+127.110190639" watchObservedRunningTime="2026-03-18 16:45:57.24072794 +0000 UTC m=+127.111352116" Mar 18 16:45:57.486042 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.485959 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-68dd9f748c-blfgv"] Mar 18 16:45:57.491204 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.491185 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.508540 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.508510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Mar 18 16:45:57.508710 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.508551 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Mar 18 16:45:57.508710 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.508593 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6d8nr\"" Mar 18 16:45:57.508710 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.508510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Mar 18 16:45:57.508710 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.508630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Mar 18 16:45:57.508967 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.508853 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Mar 18 16:45:57.511898 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.511875 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68dd9f748c-blfgv"] Mar 18 16:45:57.512561 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.512540 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Mar 18 16:45:57.643498 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-telemeter-client-tls\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643674 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-serving-certs-ca-bundle\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643674 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-secret-telemeter-client\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643812 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643812 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-federate-client-tls\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643812 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643745 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-metrics-client-ca\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643812 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbh8z\" (UniqueName: \"kubernetes.io/projected/6f491edb-828b-4cf1-b59c-520b9c8b8073-kube-api-access-qbh8z\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.643959 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.643842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745262 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-secret-telemeter-client\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: 
\"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745262 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-federate-client-tls\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-metrics-client-ca\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbh8z\" (UniqueName: \"kubernetes.io/projected/6f491edb-828b-4cf1-b59c-520b9c8b8073-kube-api-access-qbh8z\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745361 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-telemeter-client-tls\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.745784 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.745454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-serving-certs-ca-bundle\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.746180 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.746152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-metrics-client-ca\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.746293 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.746187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-serving-certs-ca-bundle\") pod 
\"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.746293 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.746275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f491edb-828b-4cf1-b59c-520b9c8b8073-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.748311 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.748290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-telemeter-client-tls\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.748420 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.748308 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.748546 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.748526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-secret-telemeter-client\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.748584 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:45:57.748533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6f491edb-828b-4cf1-b59c-520b9c8b8073-federate-client-tls\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.752268 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.752245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbh8z\" (UniqueName: \"kubernetes.io/projected/6f491edb-828b-4cf1-b59c-520b9c8b8073-kube-api-access-qbh8z\") pod \"telemeter-client-68dd9f748c-blfgv\" (UID: \"6f491edb-828b-4cf1-b59c-520b9c8b8073\") " pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.784082 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784050 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:45:57.784711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784518 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="prometheus" containerID="cri-o://80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d" gracePeriod=600 Mar 18 16:45:57.784711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784554 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy" containerID="cri-o://118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e" gracePeriod=600 Mar 18 16:45:57.784711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784588 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="thanos-sidecar" containerID="cri-o://7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc" gracePeriod=600 Mar 18 16:45:57.784711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784628 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="config-reloader" containerID="cri-o://f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c" gracePeriod=600 Mar 18 16:45:57.784711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784586 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-thanos" containerID="cri-o://98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e" gracePeriod=600 Mar 18 16:45:57.784711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.784586 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-web" containerID="cri-o://2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43" gracePeriod=600 Mar 18 16:45:57.801036 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.801016 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" Mar 18 16:45:57.948843 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:57.948810 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68dd9f748c-blfgv"] Mar 18 16:45:57.951994 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:45:57.951969 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f491edb_828b_4cf1_b59c_520b9c8b8073.slice/crio-ae92852928d367e60ef5e9f0347394003f92495e151f71aab8934e309c7f07d6 WatchSource:0}: Error finding container ae92852928d367e60ef5e9f0347394003f92495e151f71aab8934e309c7f07d6: Status 404 returned error can't find the container with id ae92852928d367e60ef5e9f0347394003f92495e151f71aab8934e309c7f07d6 Mar 18 16:45:58.219344 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219307 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e" exitCode=0 Mar 18 16:45:58.219344 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219336 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e" exitCode=0 Mar 18 16:45:58.219344 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219344 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc" exitCode=0 Mar 18 16:45:58.219344 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219349 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c" exitCode=0 Mar 18 16:45:58.219344 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:45:58.219354 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d" exitCode=0 Mar 18 16:45:58.219675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e"} Mar 18 16:45:58.219675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e"} Mar 18 16:45:58.219675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc"} Mar 18 16:45:58.219675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c"} Mar 18 16:45:58.219675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.219457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d"} Mar 18 16:45:58.220468 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:58.220448 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" event={"ID":"6f491edb-828b-4cf1-b59c-520b9c8b8073","Type":"ContainerStarted","Data":"ae92852928d367e60ef5e9f0347394003f92495e151f71aab8934e309c7f07d6"} Mar 18 16:45:59.035225 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.034522 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.159811 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.159778 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-grpc-tls\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.159819 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-tls\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.159848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-metrics-client-certs\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.159871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-serving-certs-ca-bundle\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: 
\"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.159922 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-web-config\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.159962 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-rulefiles-0\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160005 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-tls-assets\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160026 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-config\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-kubelet-serving-ca-bundle\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: 
I0318 16:45:59.160099 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-db\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160183 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-kube-rbac-proxy\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160209 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160237 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-config-out\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160266 
2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-thanos-prometheus-http-client-file\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160333 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160316 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77svp\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-kube-api-access-77svp\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160765 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160350 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-metrics-client-ca\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.160765 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160385 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-trusted-ca-bundle\") pod \"7533a106-68bc-47a5-83a7-ff29fc52f046\" (UID: \"7533a106-68bc-47a5-83a7-ff29fc52f046\") " Mar 18 16:45:59.161558 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.160774 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:59.161558 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.161032 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:59.161558 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.161266 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:59.163633 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.162483 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:59.163633 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.162510 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). 
InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:45:59.163633 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.163598 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:45:59.164135 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.164098 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.164703 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.164679 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-kube-api-access-77svp" (OuterVolumeSpecName: "kube-api-access-77svp") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "kube-api-access-77svp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:45:59.165706 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.165672 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-config" (OuterVolumeSpecName: "config") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.166053 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.166005 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.166143 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.166108 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.166143 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.166130 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:45:59.166253 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.166145 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.166753 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.166717 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-config-out" (OuterVolumeSpecName: "config-out") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:45:59.166841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.166769 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.167971 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.167934 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.168257 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.168234 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.178751 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.178716 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-web-config" (OuterVolumeSpecName: "web-config") pod "7533a106-68bc-47a5-83a7-ff29fc52f046" (UID: "7533a106-68bc-47a5-83a7-ff29fc52f046"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:45:59.227272 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.227235 2573 generic.go:358] "Generic (PLEG): container finished" podID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerID="2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43" exitCode=0 Mar 18 16:45:59.227469 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.227319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43"} Mar 18 16:45:59.227469 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.227368 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7533a106-68bc-47a5-83a7-ff29fc52f046","Type":"ContainerDied","Data":"7450c53d926d42019bf3ff0d5f7bb8907f26e0ec585e7b0d83a846e6d1622a21"} Mar 18 16:45:59.227469 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.227414 2573 scope.go:117] "RemoveContainer" containerID="98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e" Mar 18 16:45:59.227469 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.227431 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.253724 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.253692 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:45:59.258407 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.258357 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:45:59.261863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261832 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-metrics-client-ca\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.261863 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261865 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261878 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-grpc-tls\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261895 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261913 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-metrics-client-certs\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261925 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261934 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-web-config\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261948 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261962 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-tls-assets\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261976 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-config\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261985 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7533a106-68bc-47a5-83a7-ff29fc52f046-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.261994 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.262003 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-prometheus-k8s-db\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.262015 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-kube-rbac-proxy\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.262029 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.262043 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7533a106-68bc-47a5-83a7-ff29fc52f046-config-out\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262052 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.262054 2573 reconciler_common.go:299] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7533a106-68bc-47a5-83a7-ff29fc52f046-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.262591 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.262067 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-77svp\" (UniqueName: \"kubernetes.io/projected/7533a106-68bc-47a5-83a7-ff29fc52f046-kube-api-access-77svp\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:45:59.287719 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.287626 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:45:59.288025 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288008 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-thanos" Mar 18 16:45:59.288087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288028 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-thanos" Mar 18 16:45:59.288087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288042 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-web" Mar 18 16:45:59.288087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288051 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-web" Mar 18 16:45:59.288087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288068 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="config-reloader" Mar 18 16:45:59.288087 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288076 2573 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="config-reloader" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288092 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288100 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288113 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="prometheus" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288121 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="prometheus" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288129 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="init-config-reloader" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288136 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="init-config-reloader" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288145 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="thanos-sidecar" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288153 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="thanos-sidecar" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288228 2573 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="prometheus" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288241 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="config-reloader" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288253 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-web" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288264 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="thanos-sidecar" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288274 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy-thanos" Mar 18 16:45:59.288314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.288285 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" containerName="kube-rbac-proxy" Mar 18 16:45:59.293190 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.293167 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.295948 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.295848 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rlz7r\"" Mar 18 16:45:59.295948 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.295909 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 18 16:45:59.295948 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.295913 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 18 16:45:59.296228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.295983 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 18 16:45:59.296228 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.296042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-58j6hgpg9qvhh\"" Mar 18 16:45:59.297513 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.297475 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 18 16:45:59.297513 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.297502 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 18 16:45:59.297693 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.297664 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 18 16:45:59.298675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.298657 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 18 16:45:59.298675 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.298668 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 18 16:45:59.298827 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.298667 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 18 16:45:59.298827 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.298671 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 18 16:45:59.308124 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.308104 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 18 16:45:59.309605 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.309585 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 18 16:45:59.314164 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.314143 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:45:59.464188 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464383 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-config\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464383 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464383 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464383 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e923bbe4-b772-44cf-8535-fb84dcbc15c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464383 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464377 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464437 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e923bbe4-b772-44cf-8535-fb84dcbc15c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464549 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.464730 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.465170 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-9vnt7\" (UniqueName: \"kubernetes.io/projected/e923bbe4-b772-44cf-8535-fb84dcbc15c0-kube-api-access-9vnt7\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.465170 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.465170 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.464800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565270 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-config\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565270 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565270 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:45:59.565256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565590 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e923bbe4-b772-44cf-8535-fb84dcbc15c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565590 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565590 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e923bbe4-b772-44cf-8535-fb84dcbc15c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565793 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565854 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:45:59.565800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565854 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.565854 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566001 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566001 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566001 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565951 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566001 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.565979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vnt7\" (UniqueName: \"kubernetes.io/projected/e923bbe4-b772-44cf-8535-fb84dcbc15c0-kube-api-access-9vnt7\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566182 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566451 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.566991 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.566819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.568354 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:45:59.568327 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e923bbe4-b772-44cf-8535-fb84dcbc15c0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.568494 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.568442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e923bbe4-b772-44cf-8535-fb84dcbc15c0-config-out\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.568562 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.568510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-config\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.568562 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.568542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.568668 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.568629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.569347 ip-10-0-135-173 kubenswrapper[2573]: I0318 
16:45:59.569089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.569347 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.569278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.569347 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.569329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.569580 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.569374 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.569997 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.569967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.571231 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.571193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.571326 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.571262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-web-config\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.571607 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.571588 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e923bbe4-b772-44cf-8535-fb84dcbc15c0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.572037 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.572018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e923bbe4-b772-44cf-8535-fb84dcbc15c0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.576153 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.576134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vnt7\" (UniqueName: \"kubernetes.io/projected/e923bbe4-b772-44cf-8535-fb84dcbc15c0-kube-api-access-9vnt7\") pod \"prometheus-k8s-0\" (UID: \"e923bbe4-b772-44cf-8535-fb84dcbc15c0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.604756 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.604708 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:45:59.627972 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.627948 2573 scope.go:117] "RemoveContainer" containerID="118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e" Mar 18 16:45:59.635627 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.635607 2573 scope.go:117] "RemoveContainer" containerID="2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43" Mar 18 16:45:59.642534 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.642510 2573 scope.go:117] "RemoveContainer" containerID="7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc" Mar 18 16:45:59.649759 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.649737 2573 scope.go:117] "RemoveContainer" containerID="f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c" Mar 18 16:45:59.683722 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.683677 2573 scope.go:117] "RemoveContainer" containerID="80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d" Mar 18 16:45:59.692226 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.692207 2573 scope.go:117] "RemoveContainer" containerID="f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91" Mar 18 16:45:59.699520 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.699479 2573 scope.go:117] "RemoveContainer" containerID="98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e" Mar 18 16:45:59.699804 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.699781 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e\": container with ID starting with 
98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e not found: ID does not exist" containerID="98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e" Mar 18 16:45:59.699858 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.699819 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e"} err="failed to get container status \"98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e\": rpc error: code = NotFound desc = could not find container \"98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e\": container with ID starting with 98ab186780404a88cba480a01de2635a0d746848dfc74ae35d7396954126845e not found: ID does not exist" Mar 18 16:45:59.699858 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.699840 2573 scope.go:117] "RemoveContainer" containerID="118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e" Mar 18 16:45:59.700092 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.700076 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e\": container with ID starting with 118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e not found: ID does not exist" containerID="118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e" Mar 18 16:45:59.700156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700095 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e"} err="failed to get container status \"118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e\": rpc error: code = NotFound desc = could not find container \"118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e\": container with ID starting with 
118e2b9d1b61a194c730bcf9f719bfa7bf392f7ba8ef2a49616fef4c47fd9a5e not found: ID does not exist" Mar 18 16:45:59.700156 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700112 2573 scope.go:117] "RemoveContainer" containerID="2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43" Mar 18 16:45:59.700344 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.700328 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43\": container with ID starting with 2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43 not found: ID does not exist" containerID="2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43" Mar 18 16:45:59.700420 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700348 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43"} err="failed to get container status \"2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43\": rpc error: code = NotFound desc = could not find container \"2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43\": container with ID starting with 2160b136d1886a4eb459f54ab3a1df640d616c0d2e0a5978506bc5f279a0df43 not found: ID does not exist" Mar 18 16:45:59.700420 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700361 2573 scope.go:117] "RemoveContainer" containerID="7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc" Mar 18 16:45:59.700623 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.700604 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc\": container with ID starting with 7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc not found: ID does not exist" 
containerID="7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc" Mar 18 16:45:59.700660 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700627 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc"} err="failed to get container status \"7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc\": rpc error: code = NotFound desc = could not find container \"7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc\": container with ID starting with 7f9e75fbc46be0e2f88fd0bf1e632092711a5420ff8886825358b29ce388edbc not found: ID does not exist" Mar 18 16:45:59.700660 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700640 2573 scope.go:117] "RemoveContainer" containerID="f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c" Mar 18 16:45:59.700861 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.700842 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c\": container with ID starting with f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c not found: ID does not exist" containerID="f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c" Mar 18 16:45:59.700946 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700865 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c"} err="failed to get container status \"f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c\": rpc error: code = NotFound desc = could not find container \"f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c\": container with ID starting with f683432238123e7ec281595c3f6371ac7da36964e27dec712822e62f02a5fd0c not found: ID does not exist" Mar 18 
16:45:59.700946 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.700879 2573 scope.go:117] "RemoveContainer" containerID="80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d" Mar 18 16:45:59.701111 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.701093 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d\": container with ID starting with 80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d not found: ID does not exist" containerID="80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d" Mar 18 16:45:59.701154 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.701115 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d"} err="failed to get container status \"80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d\": rpc error: code = NotFound desc = could not find container \"80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d\": container with ID starting with 80842b1c19fa2637c678e854a51d130d8181d19696abfa5e9ab97b1dc49e4a6d not found: ID does not exist" Mar 18 16:45:59.701154 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.701129 2573 scope.go:117] "RemoveContainer" containerID="f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91" Mar 18 16:45:59.701367 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:45:59.701353 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91\": container with ID starting with f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91 not found: ID does not exist" containerID="f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91" Mar 18 16:45:59.701423 
ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.701371 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91"} err="failed to get container status \"f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91\": rpc error: code = NotFound desc = could not find container \"f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91\": container with ID starting with f2845fa1254d114a3071d750734b580fb4dfec59a2b7fb89e85f5956e831eb91 not found: ID does not exist" Mar 18 16:45:59.759914 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:45:59.759889 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 18 16:45:59.762283 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:45:59.762251 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode923bbe4_b772_44cf_8535_fb84dcbc15c0.slice/crio-fb29a179b9f5fa4636df82578922eedd45a0199c334067f753231fcba5a0891d WatchSource:0}: Error finding container fb29a179b9f5fa4636df82578922eedd45a0199c334067f753231fcba5a0891d: Status 404 returned error can't find the container with id fb29a179b9f5fa4636df82578922eedd45a0199c334067f753231fcba5a0891d Mar 18 16:46:00.233527 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.233483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" event={"ID":"6f491edb-828b-4cf1-b59c-520b9c8b8073","Type":"ContainerStarted","Data":"96627048e3c249a4aa226812e4df4d0a553f1a9adf5a8980b8e7d16727e544f4"} Mar 18 16:46:00.233527 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.233529 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" 
event={"ID":"6f491edb-828b-4cf1-b59c-520b9c8b8073","Type":"ContainerStarted","Data":"2be54568491544274f2f6a5c851424c61829f98ccca8d1986457e26032765e54"} Mar 18 16:46:00.234033 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.233543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" event={"ID":"6f491edb-828b-4cf1-b59c-520b9c8b8073","Type":"ContainerStarted","Data":"28fedb8c7844a1f6ae81918cc53bf4177be1774b896902358a59cf8eee4e433c"} Mar 18 16:46:00.234817 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.234792 2573 generic.go:358] "Generic (PLEG): container finished" podID="e923bbe4-b772-44cf-8535-fb84dcbc15c0" containerID="861482a9d393dc9e4f339e2541d02d123fd11e43eeed886309d0f2cf6989dced" exitCode=0 Mar 18 16:46:00.234926 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.234869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerDied","Data":"861482a9d393dc9e4f339e2541d02d123fd11e43eeed886309d0f2cf6989dced"} Mar 18 16:46:00.234926 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.234900 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"fb29a179b9f5fa4636df82578922eedd45a0199c334067f753231fcba5a0891d"} Mar 18 16:46:00.255786 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.255733 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-68dd9f748c-blfgv" podStartSLOduration=1.521410834 podStartE2EDuration="3.255718676s" podCreationTimestamp="2026-03-18 16:45:57 +0000 UTC" firstStartedPulling="2026-03-18 16:45:57.953802505 +0000 UTC m=+127.824426662" lastFinishedPulling="2026-03-18 16:45:59.688110345 +0000 UTC m=+129.558734504" observedRunningTime="2026-03-18 16:46:00.25402291 +0000 UTC 
m=+130.124647086" watchObservedRunningTime="2026-03-18 16:46:00.255718676 +0000 UTC m=+130.126343280" Mar 18 16:46:00.714956 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:00.714925 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7533a106-68bc-47a5-83a7-ff29fc52f046" path="/var/lib/kubelet/pods/7533a106-68bc-47a5-83a7-ff29fc52f046/volumes" Mar 18 16:46:01.241289 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.241242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"d2f015a5bf473b8633752d684453dc74babddb03219e7493ee383081b860f318"} Mar 18 16:46:01.241688 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.241296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"32f647be56dcaf68fc52eebe52d7d20208d77b1b92fb9f1f2ee1c4ffdf5e1f62"} Mar 18 16:46:01.241688 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.241310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"c9f487e97f4a80dce1516cd823928cc2d06b8a2bc3d164819b5331361c087dd0"} Mar 18 16:46:01.241688 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.241322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"a22146c579c101127c42565fa6d34f94cbdd8b30bfaddf52436f3e6bb473c23f"} Mar 18 16:46:01.241688 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.241333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"18221097a0de2d3426533e5404a24e7c76bd134e0295a3b60e83876478cd8a1e"} Mar 18 16:46:01.241688 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.241344 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e923bbe4-b772-44cf-8535-fb84dcbc15c0","Type":"ContainerStarted","Data":"c45c0535b823b5fdeae98f44c12e8ef7e270d51780fd721cefefb7da72a3aa4c"} Mar 18 16:46:01.278506 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:01.278409 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.27837949 podStartE2EDuration="2.27837949s" podCreationTimestamp="2026-03-18 16:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:46:01.276469121 +0000 UTC m=+131.147093288" watchObservedRunningTime="2026-03-18 16:46:01.27837949 +0000 UTC m=+131.149003665" Mar 18 16:46:04.605841 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:04.605793 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:46:59.605091 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:59.605045 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:46:59.620517 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:46:59.620488 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:47:00.441223 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:47:00.441200 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 18 16:48:00.380141 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.380103 2573 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kube-system/global-pull-secret-syncer-sfj8n"] Mar 18 16:48:00.383502 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.383479 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.385542 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.385523 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:48:00.392719 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.392697 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sfj8n"] Mar 18 16:48:00.477638 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.477602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bd8bd61d-840f-4e63-af60-47dded682df3-kubelet-config\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.477814 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.477644 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bd8bd61d-840f-4e63-af60-47dded682df3-original-pull-secret\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.477814 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.477712 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bd8bd61d-840f-4e63-af60-47dded682df3-dbus\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.578161 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:48:00.578120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bd8bd61d-840f-4e63-af60-47dded682df3-dbus\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.578328 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.578179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bd8bd61d-840f-4e63-af60-47dded682df3-kubelet-config\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.578328 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.578212 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bd8bd61d-840f-4e63-af60-47dded682df3-original-pull-secret\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.578328 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.578316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bd8bd61d-840f-4e63-af60-47dded682df3-dbus\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.578462 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.578319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bd8bd61d-840f-4e63-af60-47dded682df3-kubelet-config\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 
18 16:48:00.580484 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.580465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bd8bd61d-840f-4e63-af60-47dded682df3-original-pull-secret\") pod \"global-pull-secret-syncer-sfj8n\" (UID: \"bd8bd61d-840f-4e63-af60-47dded682df3\") " pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.693707 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.693616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sfj8n" Mar 18 16:48:00.817862 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:00.817826 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sfj8n"] Mar 18 16:48:00.821198 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:48:00.821167 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8bd61d_840f_4e63_af60_47dded682df3.slice/crio-4fca3e77776edac68f879efbdb1d255e5f4ea9d6e65689d6a00650342b12407d WatchSource:0}: Error finding container 4fca3e77776edac68f879efbdb1d255e5f4ea9d6e65689d6a00650342b12407d: Status 404 returned error can't find the container with id 4fca3e77776edac68f879efbdb1d255e5f4ea9d6e65689d6a00650342b12407d Mar 18 16:48:01.609277 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:01.609240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sfj8n" event={"ID":"bd8bd61d-840f-4e63-af60-47dded682df3","Type":"ContainerStarted","Data":"4fca3e77776edac68f879efbdb1d255e5f4ea9d6e65689d6a00650342b12407d"} Mar 18 16:48:05.622955 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:05.622921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sfj8n" 
event={"ID":"bd8bd61d-840f-4e63-af60-47dded682df3","Type":"ContainerStarted","Data":"fc5e2d1643bbb8d1a0fb1e26542c0cc34a4ae1850be2310ce3ec4da8879fa176"} Mar 18 16:48:05.640500 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:05.640443 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-sfj8n" podStartSLOduration=1.24941119 podStartE2EDuration="5.640424162s" podCreationTimestamp="2026-03-18 16:48:00 +0000 UTC" firstStartedPulling="2026-03-18 16:48:00.822879975 +0000 UTC m=+250.693504134" lastFinishedPulling="2026-03-18 16:48:05.213892934 +0000 UTC m=+255.084517106" observedRunningTime="2026-03-18 16:48:05.638896218 +0000 UTC m=+255.509520394" watchObservedRunningTime="2026-03-18 16:48:05.640424162 +0000 UTC m=+255.511048341" Mar 18 16:48:50.599247 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:48:50.599220 2573 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:49:54.187054 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.187018 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-748c497bc-9fhb8"] Mar 18 16:49:54.190285 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.190263 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.192405 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.192375 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:49:54.192527 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.192488 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h8m8v\"" Mar 18 16:49:54.193051 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.193027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Mar 18 16:49:54.193051 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.193050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:49:54.199674 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.199648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-9fhb8"] Mar 18 16:49:54.227950 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.227920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrslg\" (UniqueName: \"kubernetes.io/projected/ab5793e7-2304-4e54-ac32-abc53eb88fa7-kube-api-access-zrslg\") pod \"seaweedfs-748c497bc-9fhb8\" (UID: \"ab5793e7-2304-4e54-ac32-abc53eb88fa7\") " pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.228119 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.227976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ab5793e7-2304-4e54-ac32-abc53eb88fa7-data\") pod \"seaweedfs-748c497bc-9fhb8\" (UID: \"ab5793e7-2304-4e54-ac32-abc53eb88fa7\") " pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.328548 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.328515 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zrslg\" (UniqueName: \"kubernetes.io/projected/ab5793e7-2304-4e54-ac32-abc53eb88fa7-kube-api-access-zrslg\") pod \"seaweedfs-748c497bc-9fhb8\" (UID: \"ab5793e7-2304-4e54-ac32-abc53eb88fa7\") " pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.328689 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.328648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ab5793e7-2304-4e54-ac32-abc53eb88fa7-data\") pod \"seaweedfs-748c497bc-9fhb8\" (UID: \"ab5793e7-2304-4e54-ac32-abc53eb88fa7\") " pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.328989 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.328973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ab5793e7-2304-4e54-ac32-abc53eb88fa7-data\") pod \"seaweedfs-748c497bc-9fhb8\" (UID: \"ab5793e7-2304-4e54-ac32-abc53eb88fa7\") " pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.336740 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.336705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrslg\" (UniqueName: \"kubernetes.io/projected/ab5793e7-2304-4e54-ac32-abc53eb88fa7-kube-api-access-zrslg\") pod \"seaweedfs-748c497bc-9fhb8\" (UID: \"ab5793e7-2304-4e54-ac32-abc53eb88fa7\") " pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.500140 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.500044 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-748c497bc-9fhb8" Mar 18 16:49:54.618844 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.618810 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-9fhb8"] Mar 18 16:49:54.623253 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:49:54.623225 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab5793e7_2304_4e54_ac32_abc53eb88fa7.slice/crio-5e1173150140b5a6c7a4db0b5207b4b0acea83de3f83dbce1c135f20a518297c WatchSource:0}: Error finding container 5e1173150140b5a6c7a4db0b5207b4b0acea83de3f83dbce1c135f20a518297c: Status 404 returned error can't find the container with id 5e1173150140b5a6c7a4db0b5207b4b0acea83de3f83dbce1c135f20a518297c Mar 18 16:49:54.624453 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.624437 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:49:54.937135 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:54.937092 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-9fhb8" event={"ID":"ab5793e7-2304-4e54-ac32-abc53eb88fa7","Type":"ContainerStarted","Data":"5e1173150140b5a6c7a4db0b5207b4b0acea83de3f83dbce1c135f20a518297c"} Mar 18 16:49:55.517806 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.517776 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8g4s5"] Mar 18 16:49:55.521640 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.521616 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.524778 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.524753 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-2zc9n\"" Mar 18 16:49:55.524899 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.524792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Mar 18 16:49:55.530971 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.530522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8g4s5"] Mar 18 16:49:55.538123 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.538098 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd41a339-2609-4ede-ba5e-c5feefadb0ed-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8g4s5\" (UID: \"bd41a339-2609-4ede-ba5e-c5feefadb0ed\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.538250 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.538152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxq4v\" (UniqueName: \"kubernetes.io/projected/bd41a339-2609-4ede-ba5e-c5feefadb0ed-kube-api-access-dxq4v\") pod \"kserve-controller-manager-69d7c9bbdc-8g4s5\" (UID: \"bd41a339-2609-4ede-ba5e-c5feefadb0ed\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.639194 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.639152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd41a339-2609-4ede-ba5e-c5feefadb0ed-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8g4s5\" (UID: \"bd41a339-2609-4ede-ba5e-c5feefadb0ed\") " 
pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.639387 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.639218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxq4v\" (UniqueName: \"kubernetes.io/projected/bd41a339-2609-4ede-ba5e-c5feefadb0ed-kube-api-access-dxq4v\") pod \"kserve-controller-manager-69d7c9bbdc-8g4s5\" (UID: \"bd41a339-2609-4ede-ba5e-c5feefadb0ed\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.647496 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.642567 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd41a339-2609-4ede-ba5e-c5feefadb0ed-cert\") pod \"kserve-controller-manager-69d7c9bbdc-8g4s5\" (UID: \"bd41a339-2609-4ede-ba5e-c5feefadb0ed\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.650838 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.650777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxq4v\" (UniqueName: \"kubernetes.io/projected/bd41a339-2609-4ede-ba5e-c5feefadb0ed-kube-api-access-dxq4v\") pod \"kserve-controller-manager-69d7c9bbdc-8g4s5\" (UID: \"bd41a339-2609-4ede-ba5e-c5feefadb0ed\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.835640 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.835603 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:55.961047 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:55.960975 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-8g4s5"] Mar 18 16:49:55.963681 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:49:55.963651 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd41a339_2609_4ede_ba5e_c5feefadb0ed.slice/crio-5223d24536be10e7cf88be589f1e72d34653e075eb26bb42b2c2412041eee76d WatchSource:0}: Error finding container 5223d24536be10e7cf88be589f1e72d34653e075eb26bb42b2c2412041eee76d: Status 404 returned error can't find the container with id 5223d24536be10e7cf88be589f1e72d34653e075eb26bb42b2c2412041eee76d Mar 18 16:49:56.945028 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:56.944986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" event={"ID":"bd41a339-2609-4ede-ba5e-c5feefadb0ed","Type":"ContainerStarted","Data":"5223d24536be10e7cf88be589f1e72d34653e075eb26bb42b2c2412041eee76d"} Mar 18 16:49:59.956297 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:59.956262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" event={"ID":"bd41a339-2609-4ede-ba5e-c5feefadb0ed","Type":"ContainerStarted","Data":"ae3a96f4ad3c44e55816a5e78c04d7fdc7f525ee2086fae72e80359a6122d87b"} Mar 18 16:49:59.956751 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:59.956439 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:49:59.958114 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:59.958083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-9fhb8" 
event={"ID":"ab5793e7-2304-4e54-ac32-abc53eb88fa7","Type":"ContainerStarted","Data":"45846e77a86ea865abac177b8602e9024324cf346d636b8cdbf2f0070c2413ea"} Mar 18 16:49:59.971320 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:59.971258 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" podStartSLOduration=1.7992302150000001 podStartE2EDuration="4.971240816s" podCreationTimestamp="2026-03-18 16:49:55 +0000 UTC" firstStartedPulling="2026-03-18 16:49:55.965530545 +0000 UTC m=+365.836154699" lastFinishedPulling="2026-03-18 16:49:59.137541143 +0000 UTC m=+369.008165300" observedRunningTime="2026-03-18 16:49:59.970687838 +0000 UTC m=+369.841312015" watchObservedRunningTime="2026-03-18 16:49:59.971240816 +0000 UTC m=+369.841864993" Mar 18 16:49:59.986041 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:49:59.985991 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-748c497bc-9fhb8" podStartSLOduration=1.418123458 podStartE2EDuration="5.985977368s" podCreationTimestamp="2026-03-18 16:49:54 +0000 UTC" firstStartedPulling="2026-03-18 16:49:54.624551569 +0000 UTC m=+364.495175723" lastFinishedPulling="2026-03-18 16:49:59.192405462 +0000 UTC m=+369.063029633" observedRunningTime="2026-03-18 16:49:59.984333273 +0000 UTC m=+369.854957448" watchObservedRunningTime="2026-03-18 16:49:59.985977368 +0000 UTC m=+369.856601544" Mar 18 16:50:30.966378 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:30.966343 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-8g4s5" Mar 18 16:50:47.861244 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:47.861211 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-gdvxs"] Mar 18 16:50:47.864295 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:47.864280 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-gdvxs" Mar 18 16:50:47.871672 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:47.871650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-gdvxs"] Mar 18 16:50:47.891314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:47.891290 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz565\" (UniqueName: \"kubernetes.io/projected/91a770c1-5a7c-4885-86c2-99635d5626a8-kube-api-access-mz565\") pod \"s3-init-gdvxs\" (UID: \"91a770c1-5a7c-4885-86c2-99635d5626a8\") " pod="kserve/s3-init-gdvxs" Mar 18 16:50:47.991705 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:47.991672 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz565\" (UniqueName: \"kubernetes.io/projected/91a770c1-5a7c-4885-86c2-99635d5626a8-kube-api-access-mz565\") pod \"s3-init-gdvxs\" (UID: \"91a770c1-5a7c-4885-86c2-99635d5626a8\") " pod="kserve/s3-init-gdvxs" Mar 18 16:50:48.000132 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:48.000101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz565\" (UniqueName: \"kubernetes.io/projected/91a770c1-5a7c-4885-86c2-99635d5626a8-kube-api-access-mz565\") pod \"s3-init-gdvxs\" (UID: \"91a770c1-5a7c-4885-86c2-99635d5626a8\") " pod="kserve/s3-init-gdvxs" Mar 18 16:50:48.186153 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:48.186067 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-gdvxs" Mar 18 16:50:48.301795 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:48.301468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-gdvxs"] Mar 18 16:50:48.304334 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:50:48.304306 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a770c1_5a7c_4885_86c2_99635d5626a8.slice/crio-8771a98fdd33014ce09382e2b5dd9a95697c002be5809ac7c2e2df0334296b50 WatchSource:0}: Error finding container 8771a98fdd33014ce09382e2b5dd9a95697c002be5809ac7c2e2df0334296b50: Status 404 returned error can't find the container with id 8771a98fdd33014ce09382e2b5dd9a95697c002be5809ac7c2e2df0334296b50 Mar 18 16:50:49.104019 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:49.103964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gdvxs" event={"ID":"91a770c1-5a7c-4885-86c2-99635d5626a8","Type":"ContainerStarted","Data":"8771a98fdd33014ce09382e2b5dd9a95697c002be5809ac7c2e2df0334296b50"} Mar 18 16:50:53.117139 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:53.117103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gdvxs" event={"ID":"91a770c1-5a7c-4885-86c2-99635d5626a8","Type":"ContainerStarted","Data":"4f5a5f72f246524960680fdb4865ad21f6e783818c7f610f6cdf12d17e7e6604"} Mar 18 16:50:53.133143 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:53.133095 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-gdvxs" podStartSLOduration=1.717632167 podStartE2EDuration="6.133079214s" podCreationTimestamp="2026-03-18 16:50:47 +0000 UTC" firstStartedPulling="2026-03-18 16:50:48.306086452 +0000 UTC m=+418.176710606" lastFinishedPulling="2026-03-18 16:50:52.721533497 +0000 UTC m=+422.592157653" observedRunningTime="2026-03-18 16:50:53.1320105 +0000 UTC m=+423.002634675" watchObservedRunningTime="2026-03-18 
16:50:53.133079214 +0000 UTC m=+423.003703389" Mar 18 16:50:56.127005 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:56.126973 2573 generic.go:358] "Generic (PLEG): container finished" podID="91a770c1-5a7c-4885-86c2-99635d5626a8" containerID="4f5a5f72f246524960680fdb4865ad21f6e783818c7f610f6cdf12d17e7e6604" exitCode=0 Mar 18 16:50:56.127359 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:56.127045 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gdvxs" event={"ID":"91a770c1-5a7c-4885-86c2-99635d5626a8","Type":"ContainerDied","Data":"4f5a5f72f246524960680fdb4865ad21f6e783818c7f610f6cdf12d17e7e6604"} Mar 18 16:50:57.251702 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:57.251675 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-gdvxs" Mar 18 16:50:57.375621 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:57.375586 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz565\" (UniqueName: \"kubernetes.io/projected/91a770c1-5a7c-4885-86c2-99635d5626a8-kube-api-access-mz565\") pod \"91a770c1-5a7c-4885-86c2-99635d5626a8\" (UID: \"91a770c1-5a7c-4885-86c2-99635d5626a8\") " Mar 18 16:50:57.377725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:57.377698 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a770c1-5a7c-4885-86c2-99635d5626a8-kube-api-access-mz565" (OuterVolumeSpecName: "kube-api-access-mz565") pod "91a770c1-5a7c-4885-86c2-99635d5626a8" (UID: "91a770c1-5a7c-4885-86c2-99635d5626a8"). InnerVolumeSpecName "kube-api-access-mz565". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:50:57.476435 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:57.476401 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mz565\" (UniqueName: \"kubernetes.io/projected/91a770c1-5a7c-4885-86c2-99635d5626a8-kube-api-access-mz565\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:50:58.133839 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:58.133811 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-gdvxs" Mar 18 16:50:58.134020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:58.133801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-gdvxs" event={"ID":"91a770c1-5a7c-4885-86c2-99635d5626a8","Type":"ContainerDied","Data":"8771a98fdd33014ce09382e2b5dd9a95697c002be5809ac7c2e2df0334296b50"} Mar 18 16:50:58.134020 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:50:58.133925 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8771a98fdd33014ce09382e2b5dd9a95697c002be5809ac7c2e2df0334296b50" Mar 18 16:51:09.695550 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.695466 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fb95c978c-gxmqd"] Mar 18 16:51:09.695996 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.695812 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a770c1-5a7c-4885-86c2-99635d5626a8" containerName="s3-init" Mar 18 16:51:09.695996 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.695823 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a770c1-5a7c-4885-86c2-99635d5626a8" containerName="s3-init" Mar 18 16:51:09.695996 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.695877 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a770c1-5a7c-4885-86c2-99635d5626a8" containerName="s3-init" Mar 18 16:51:09.809119 ip-10-0-135-173 
kubenswrapper[2573]: I0318 16:51:09.809080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fb95c978c-gxmqd"] Mar 18 16:51:09.809281 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.809233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.812008 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.811984 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 18 16:51:09.812165 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812099 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 18 16:51:09.812165 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812126 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-l5zrn\"" Mar 18 16:51:09.812575 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812554 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:51:09.812655 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812629 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 18 16:51:09.812655 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 18 16:51:09.812757 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:51:09.812757 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.812670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 18 16:51:09.817543 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.817526 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 18 16:51:09.878722 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.878676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhvdx\" (UniqueName: \"kubernetes.io/projected/02f06747-157a-4688-8e56-f6c8837b2273-kube-api-access-mhvdx\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.878722 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.878726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-console-config\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.878976 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.878800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-oauth-serving-cert\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.878976 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.878845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02f06747-157a-4688-8e56-f6c8837b2273-console-serving-cert\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " 
pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.878976 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.878889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-trusted-ca-bundle\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.878976 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.878967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02f06747-157a-4688-8e56-f6c8837b2273-console-oauth-config\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.879107 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.879004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-service-ca\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.979848 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.979761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02f06747-157a-4688-8e56-f6c8837b2273-console-oauth-config\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.979848 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.979814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-service-ca\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.979848 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.979842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhvdx\" (UniqueName: \"kubernetes.io/projected/02f06747-157a-4688-8e56-f6c8837b2273-kube-api-access-mhvdx\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980131 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.979948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-console-config\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980131 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.980019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-oauth-serving-cert\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980131 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.980066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02f06747-157a-4688-8e56-f6c8837b2273-console-serving-cert\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980131 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.980112 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-trusted-ca-bundle\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980709 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.980680 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-service-ca\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980824 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.980724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-oauth-serving-cert\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.980887 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.980877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-console-config\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.981100 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.981081 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02f06747-157a-4688-8e56-f6c8837b2273-trusted-ca-bundle\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.982207 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.982187 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02f06747-157a-4688-8e56-f6c8837b2273-console-oauth-config\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.982567 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.982549 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02f06747-157a-4688-8e56-f6c8837b2273-console-serving-cert\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:09.987603 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:09.987579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhvdx\" (UniqueName: \"kubernetes.io/projected/02f06747-157a-4688-8e56-f6c8837b2273-kube-api-access-mhvdx\") pod \"console-5fb95c978c-gxmqd\" (UID: \"02f06747-157a-4688-8e56-f6c8837b2273\") " pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:10.119612 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:10.119573 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:10.240547 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:10.240457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fb95c978c-gxmqd"] Mar 18 16:51:10.243736 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:51:10.243704 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f06747_157a_4688_8e56_f6c8837b2273.slice/crio-de969e2c6fd568c45b2cb0c7464811f8106fd91b72feae3aa33a945ff5e6a271 WatchSource:0}: Error finding container de969e2c6fd568c45b2cb0c7464811f8106fd91b72feae3aa33a945ff5e6a271: Status 404 returned error can't find the container with id de969e2c6fd568c45b2cb0c7464811f8106fd91b72feae3aa33a945ff5e6a271 Mar 18 16:51:11.173102 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:11.173069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fb95c978c-gxmqd" event={"ID":"02f06747-157a-4688-8e56-f6c8837b2273","Type":"ContainerStarted","Data":"82a7fc03aeeef082cfdfb104cb0d471f1f9c6aafb9db6817a231210038337c81"} Mar 18 16:51:11.173102 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:11.173106 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fb95c978c-gxmqd" event={"ID":"02f06747-157a-4688-8e56-f6c8837b2273","Type":"ContainerStarted","Data":"de969e2c6fd568c45b2cb0c7464811f8106fd91b72feae3aa33a945ff5e6a271"} Mar 18 16:51:11.189048 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:11.188997 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fb95c978c-gxmqd" podStartSLOduration=2.188980621 podStartE2EDuration="2.188980621s" podCreationTimestamp="2026-03-18 16:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:51:11.187875688 +0000 UTC 
m=+441.058499863" watchObservedRunningTime="2026-03-18 16:51:11.188980621 +0000 UTC m=+441.059604801" Mar 18 16:51:20.119885 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:20.119841 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:20.119885 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:20.119888 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:20.124711 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:20.124683 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:51:20.204778 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:51:20.204753 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fb95c978c-gxmqd" Mar 18 16:54:51.270201 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.270153 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m"] Mar 18 16:54:51.275792 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.275763 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.276371 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.276347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.276525 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.276477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232e9e1c-3dde-4f5e-8405-045a0555dbf0-openshift-service-ca-bundle\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.278009 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.277987 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-41715-serving-cert\"" Mar 18 16:54:51.278106 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.278011 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fnxb\"" Mar 18 16:54:51.278106 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.277991 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:54:51.278697 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.278678 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-41715-kube-rbac-proxy-sar-config\"" Mar 18 16:54:51.281453 ip-10-0-135-173 kubenswrapper[2573]: I0318 
16:54:51.281435 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m"] Mar 18 16:54:51.377661 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.377628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.377820 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.377687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232e9e1c-3dde-4f5e-8405-045a0555dbf0-openshift-service-ca-bundle\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.377820 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:54:51.377767 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-41715-serving-cert: secret "model-chainer-raw-41715-serving-cert" not found Mar 18 16:54:51.377894 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:54:51.377847 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls podName:232e9e1c-3dde-4f5e-8405-045a0555dbf0 nodeName:}" failed. No retries permitted until 2026-03-18 16:54:51.877827708 +0000 UTC m=+661.748451863 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls") pod "model-chainer-raw-41715-84f87fd6b7-shn4m" (UID: "232e9e1c-3dde-4f5e-8405-045a0555dbf0") : secret "model-chainer-raw-41715-serving-cert" not found Mar 18 16:54:51.378318 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.378300 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232e9e1c-3dde-4f5e-8405-045a0555dbf0-openshift-service-ca-bundle\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.882738 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.882702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.885095 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.885067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls\") pod \"model-chainer-raw-41715-84f87fd6b7-shn4m\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:51.887988 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:51.887973 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:52.004490 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:52.004466 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m"] Mar 18 16:54:52.006894 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:54:52.006863 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod232e9e1c_3dde_4f5e_8405_045a0555dbf0.slice/crio-d660c4bd80253fe18fed8cdb6833cc75c8ae746f1aa66cb93468682d484e913f WatchSource:0}: Error finding container d660c4bd80253fe18fed8cdb6833cc75c8ae746f1aa66cb93468682d484e913f: Status 404 returned error can't find the container with id d660c4bd80253fe18fed8cdb6833cc75c8ae746f1aa66cb93468682d484e913f Mar 18 16:54:52.841019 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:52.840977 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" event={"ID":"232e9e1c-3dde-4f5e-8405-045a0555dbf0","Type":"ContainerStarted","Data":"d660c4bd80253fe18fed8cdb6833cc75c8ae746f1aa66cb93468682d484e913f"} Mar 18 16:54:54.848024 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:54.847934 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" event={"ID":"232e9e1c-3dde-4f5e-8405-045a0555dbf0","Type":"ContainerStarted","Data":"b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864"} Mar 18 16:54:54.848024 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:54.847988 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:54:54.870915 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:54:54.870867 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podStartSLOduration=1.280591004 podStartE2EDuration="3.870853977s" podCreationTimestamp="2026-03-18 16:54:51 +0000 UTC" firstStartedPulling="2026-03-18 16:54:52.008680681 +0000 UTC m=+661.879304838" lastFinishedPulling="2026-03-18 16:54:54.598943657 +0000 UTC m=+664.469567811" observedRunningTime="2026-03-18 16:54:54.868776867 +0000 UTC m=+664.739401042" watchObservedRunningTime="2026-03-18 16:54:54.870853977 +0000 UTC m=+664.741478152" Mar 18 16:55:00.855922 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:00.855893 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:55:01.299368 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:01.299285 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m"] Mar 18 16:55:01.299610 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:01.299559 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" containerID="cri-o://b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864" gracePeriod=30 Mar 18 16:55:05.854624 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:05.854557 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:10.854558 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:10.854525 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:15.854845 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:15.854795 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:15.855320 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:15.854910 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:55:20.854325 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:20.854287 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:25.854643 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:25.854607 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:30.854603 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:30.854562 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:55:31.438902 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.438878 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:55:31.629533 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.629504 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232e9e1c-3dde-4f5e-8405-045a0555dbf0-openshift-service-ca-bundle\") pod \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " Mar 18 16:55:31.629715 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.629562 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls\") pod \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\" (UID: \"232e9e1c-3dde-4f5e-8405-045a0555dbf0\") " Mar 18 16:55:31.629909 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.629883 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/232e9e1c-3dde-4f5e-8405-045a0555dbf0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "232e9e1c-3dde-4f5e-8405-045a0555dbf0" (UID: "232e9e1c-3dde-4f5e-8405-045a0555dbf0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:55:31.631725 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.631698 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "232e9e1c-3dde-4f5e-8405-045a0555dbf0" (UID: "232e9e1c-3dde-4f5e-8405-045a0555dbf0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:55:31.730525 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.730495 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232e9e1c-3dde-4f5e-8405-045a0555dbf0-openshift-service-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:55:31.730525 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.730523 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/232e9e1c-3dde-4f5e-8405-045a0555dbf0-proxy-tls\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:55:31.957165 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.957082 2573 generic.go:358] "Generic (PLEG): container finished" podID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerID="b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864" exitCode=0 Mar 18 16:55:31.957165 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.957144 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" Mar 18 16:55:31.957626 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.957175 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" event={"ID":"232e9e1c-3dde-4f5e-8405-045a0555dbf0","Type":"ContainerDied","Data":"b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864"} Mar 18 16:55:31.957626 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.957210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m" event={"ID":"232e9e1c-3dde-4f5e-8405-045a0555dbf0","Type":"ContainerDied","Data":"d660c4bd80253fe18fed8cdb6833cc75c8ae746f1aa66cb93468682d484e913f"} Mar 18 16:55:31.957626 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.957226 2573 scope.go:117] "RemoveContainer" containerID="b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864" Mar 18 16:55:31.968014 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.967995 2573 scope.go:117] "RemoveContainer" containerID="b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864" Mar 18 16:55:31.968254 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:55:31.968234 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864\": container with ID starting with b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864 not found: ID does not exist" containerID="b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864" Mar 18 16:55:31.968314 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.968264 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864"} err="failed to get container status 
\"b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864\": rpc error: code = NotFound desc = could not find container \"b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864\": container with ID starting with b69374d27dfe453628fbe9cf245aad3169cef1937dbb4e82f0b54e79f8756864 not found: ID does not exist" Mar 18 16:55:31.978883 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.978863 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m"] Mar 18 16:55:31.983379 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:31.983358 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-41715-84f87fd6b7-shn4m"] Mar 18 16:55:32.714777 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:55:32.714746 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" path="/var/lib/kubelet/pods/232e9e1c-3dde-4f5e-8405-045a0555dbf0/volumes" Mar 18 16:56:51.658518 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.658483 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp"] Mar 18 16:56:51.658939 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.658870 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" Mar 18 16:56:51.658939 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.658883 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" Mar 18 16:56:51.659012 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.658964 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="232e9e1c-3dde-4f5e-8405-045a0555dbf0" containerName="model-chainer-raw-41715" Mar 18 16:56:51.660912 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.660894 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:51.662933 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.662908 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-598eb-serving-cert\"" Mar 18 16:56:51.663064 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.662910 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-598eb-kube-rbac-proxy-sar-config\"" Mar 18 16:56:51.663064 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.663035 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Mar 18 16:56:51.663595 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.663577 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fnxb\"" Mar 18 16:56:51.670672 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.670648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp"] Mar 18 16:56:51.694305 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.694271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:51.694460 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.694370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:51.795532 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.795490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:51.795705 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.795597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:51.795756 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:56:51.795744 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-serving-cert: secret "model-chainer-raw-hpa-598eb-serving-cert" not found Mar 18 16:56:51.795843 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:56:51.795831 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls podName:fbd0532b-1313-4dc0-ac0f-9c46c588c5d0 nodeName:}" failed. No retries permitted until 2026-03-18 16:56:52.295808523 +0000 UTC m=+782.166432679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls") pod "model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" (UID: "fbd0532b-1313-4dc0-ac0f-9c46c588c5d0") : secret "model-chainer-raw-hpa-598eb-serving-cert" not found Mar 18 16:56:51.796238 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:51.796220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:52.299046 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:52.299007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:52.301423 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:52.301379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls\") pod \"model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:52.572573 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:52.572534 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:56:52.690243 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:52.690212 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp"] Mar 18 16:56:52.694030 ip-10-0-135-173 kubenswrapper[2573]: W0318 16:56:52.694004 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd0532b_1313_4dc0_ac0f_9c46c588c5d0.slice/crio-edda159d28f44087dfee7d816cdd9945b0175ab23b3f30b07a45d070d45b286d WatchSource:0}: Error finding container edda159d28f44087dfee7d816cdd9945b0175ab23b3f30b07a45d070d45b286d: Status 404 returned error can't find the container with id edda159d28f44087dfee7d816cdd9945b0175ab23b3f30b07a45d070d45b286d Mar 18 16:56:52.695928 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:52.695911 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:56:53.211171 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:53.211137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" event={"ID":"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0","Type":"ContainerStarted","Data":"7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c"} Mar 18 16:56:53.211171 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:53.211172 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" event={"ID":"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0","Type":"ContainerStarted","Data":"edda159d28f44087dfee7d816cdd9945b0175ab23b3f30b07a45d070d45b286d"} Mar 18 16:56:53.211455 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:53.211258 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 
16:56:53.230563 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:53.230520 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podStartSLOduration=2.230505799 podStartE2EDuration="2.230505799s" podCreationTimestamp="2026-03-18 16:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:56:53.228593141 +0000 UTC m=+783.099217317" watchObservedRunningTime="2026-03-18 16:56:53.230505799 +0000 UTC m=+783.101129974" Mar 18 16:56:59.219375 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:56:59.219340 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:57:01.741088 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:01.741055 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp"] Mar 18 16:57:01.741495 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:01.741271 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" containerID="cri-o://7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c" gracePeriod=30 Mar 18 16:57:04.218061 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:04.218021 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:09.217924 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:09.217889 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:14.218717 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:14.218675 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:14.219181 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:14.218781 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:57:19.218515 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:19.218477 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:24.218922 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:24.218881 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:29.218471 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:29.218431 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:57:31.878669 ip-10-0-135-173 kubenswrapper[2573]: 
I0318 16:57:31.878646 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:57:32.038837 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.038746 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-openshift-service-ca-bundle\") pod \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " Mar 18 16:57:32.038837 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.038792 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls\") pod \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\" (UID: \"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0\") " Mar 18 16:57:32.039108 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.039083 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" (UID: "fbd0532b-1313-4dc0-ac0f-9c46c588c5d0"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:57:32.040838 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.040811 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" (UID: "fbd0532b-1313-4dc0-ac0f-9c46c588c5d0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:57:32.139746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.139717 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-openshift-service-ca-bundle\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:57:32.139746 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.139744 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0-proxy-tls\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\"" Mar 18 16:57:32.328877 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.328844 2573 generic.go:358] "Generic (PLEG): container finished" podID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerID="7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c" exitCode=0 Mar 18 16:57:32.329062 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.328890 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" event={"ID":"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0","Type":"ContainerDied","Data":"7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c"} Mar 18 16:57:32.329062 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.328912 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" event={"ID":"fbd0532b-1313-4dc0-ac0f-9c46c588c5d0","Type":"ContainerDied","Data":"edda159d28f44087dfee7d816cdd9945b0175ab23b3f30b07a45d070d45b286d"} Mar 18 16:57:32.329062 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.328920 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp" Mar 18 16:57:32.329062 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.328926 2573 scope.go:117] "RemoveContainer" containerID="7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c" Mar 18 16:57:32.337585 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.337569 2573 scope.go:117] "RemoveContainer" containerID="7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c" Mar 18 16:57:32.337823 ip-10-0-135-173 kubenswrapper[2573]: E0318 16:57:32.337803 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c\": container with ID starting with 7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c not found: ID does not exist" containerID="7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c" Mar 18 16:57:32.337870 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.337830 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c"} err="failed to get container status \"7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c\": rpc error: code = NotFound desc = could not find container \"7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c\": container with ID starting with 7c4ff8410bb9e6139bd6e102967400aa703b2dd65d117deeb5730ba8de0a378c not found: ID does not exist" Mar 18 16:57:32.347997 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.347977 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp"] Mar 18 16:57:32.351204 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.351184 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-598eb-86b654b4cf-5jdmp"] Mar 18 16:57:32.714155 ip-10-0-135-173 kubenswrapper[2573]: I0318 16:57:32.714061 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" path="/var/lib/kubelet/pods/fbd0532b-1313-4dc0-ac0f-9c46c588c5d0/volumes" Mar 18 17:07:16.941900 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.941869 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7lrgj/must-gather-hz757"] Mar 18 17:07:16.942301 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.942199 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" Mar 18 17:07:16.942301 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.942210 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" Mar 18 17:07:16.942301 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.942268 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbd0532b-1313-4dc0-ac0f-9c46c588c5d0" containerName="model-chainer-raw-hpa-598eb" Mar 18 17:07:16.945221 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.945205 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:16.948080 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.947566 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7lrgj\"/\"openshift-service-ca.crt\"" Mar 18 17:07:16.948080 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.947882 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7lrgj\"/\"default-dockercfg-xprq8\"" Mar 18 17:07:16.948080 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.947945 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7lrgj\"/\"kube-root-ca.crt\"" Mar 18 17:07:16.953957 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.953931 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7lrgj/must-gather-hz757"] Mar 18 17:07:16.995374 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.995347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/38313260-b7df-4b9a-9a29-78a128b9a617-must-gather-output\") pod \"must-gather-hz757\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") " pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:16.995518 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:16.995476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thzlt\" (UniqueName: \"kubernetes.io/projected/38313260-b7df-4b9a-9a29-78a128b9a617-kube-api-access-thzlt\") pod \"must-gather-hz757\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") " pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:17.096513 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.096482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thzlt\" (UniqueName: 
\"kubernetes.io/projected/38313260-b7df-4b9a-9a29-78a128b9a617-kube-api-access-thzlt\") pod \"must-gather-hz757\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") " pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:17.096686 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.096532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/38313260-b7df-4b9a-9a29-78a128b9a617-must-gather-output\") pod \"must-gather-hz757\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") " pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:17.096818 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.096802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/38313260-b7df-4b9a-9a29-78a128b9a617-must-gather-output\") pod \"must-gather-hz757\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") " pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:17.104550 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.104522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thzlt\" (UniqueName: \"kubernetes.io/projected/38313260-b7df-4b9a-9a29-78a128b9a617-kube-api-access-thzlt\") pod \"must-gather-hz757\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") " pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:17.269103 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.269012 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7lrgj/must-gather-hz757" Mar 18 17:07:17.390656 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.390629 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7lrgj/must-gather-hz757"] Mar 18 17:07:17.394040 ip-10-0-135-173 kubenswrapper[2573]: W0318 17:07:17.394012 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38313260_b7df_4b9a_9a29_78a128b9a617.slice/crio-662a50c621c690c627e3a6cde43fcd4f0a448d03114c6913aaccea042ff5b919 WatchSource:0}: Error finding container 662a50c621c690c627e3a6cde43fcd4f0a448d03114c6913aaccea042ff5b919: Status 404 returned error can't find the container with id 662a50c621c690c627e3a6cde43fcd4f0a448d03114c6913aaccea042ff5b919 Mar 18 17:07:17.395783 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:17.395765 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 17:07:18.060644 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:18.060616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lrgj/must-gather-hz757" event={"ID":"38313260-b7df-4b9a-9a29-78a128b9a617","Type":"ContainerStarted","Data":"662a50c621c690c627e3a6cde43fcd4f0a448d03114c6913aaccea042ff5b919"} Mar 18 17:07:23.077767 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:23.077733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lrgj/must-gather-hz757" event={"ID":"38313260-b7df-4b9a-9a29-78a128b9a617","Type":"ContainerStarted","Data":"233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553"} Mar 18 17:07:23.077767 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:23.077773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lrgj/must-gather-hz757" 
event={"ID":"38313260-b7df-4b9a-9a29-78a128b9a617","Type":"ContainerStarted","Data":"7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc"} Mar 18 17:07:23.094439 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:23.094370 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7lrgj/must-gather-hz757" podStartSLOduration=2.269382598 podStartE2EDuration="7.094354242s" podCreationTimestamp="2026-03-18 17:07:16 +0000 UTC" firstStartedPulling="2026-03-18 17:07:17.39592229 +0000 UTC m=+1407.266546444" lastFinishedPulling="2026-03-18 17:07:22.220893919 +0000 UTC m=+1412.091518088" observedRunningTime="2026-03-18 17:07:23.092246232 +0000 UTC m=+1412.962870409" watchObservedRunningTime="2026-03-18 17:07:23.094354242 +0000 UTC m=+1412.964978418" Mar 18 17:07:40.133710 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:40.133674 2573 generic.go:358] "Generic (PLEG): container finished" podID="38313260-b7df-4b9a-9a29-78a128b9a617" containerID="7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc" exitCode=0 Mar 18 17:07:40.134124 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:40.133747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7lrgj/must-gather-hz757" event={"ID":"38313260-b7df-4b9a-9a29-78a128b9a617","Type":"ContainerDied","Data":"7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc"} Mar 18 17:07:40.134124 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:40.134089 2573 scope.go:117] "RemoveContainer" containerID="7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc" Mar 18 17:07:40.760448 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:40.760418 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7lrgj_must-gather-hz757_38313260-b7df-4b9a-9a29-78a128b9a617/gather/0.log" Mar 18 17:07:41.306318 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.306288 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-vwfl5/must-gather-9m6sd"] Mar 18 17:07:41.309925 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.309909 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.312164 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.312137 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vwfl5\"/\"openshift-service-ca.crt\"" Mar 18 17:07:41.312164 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.312140 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vwfl5\"/\"default-dockercfg-cfrxv\"" Mar 18 17:07:41.312862 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.312844 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vwfl5\"/\"kube-root-ca.crt\"" Mar 18 17:07:41.317130 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.317019 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/must-gather-9m6sd"] Mar 18 17:07:41.417682 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.417651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a870cfb4-4476-4699-a64d-f79c5871f29a-must-gather-output\") pod \"must-gather-9m6sd\" (UID: \"a870cfb4-4476-4699-a64d-f79c5871f29a\") " pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.417856 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.417749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62vk\" (UniqueName: \"kubernetes.io/projected/a870cfb4-4476-4699-a64d-f79c5871f29a-kube-api-access-q62vk\") pod \"must-gather-9m6sd\" (UID: \"a870cfb4-4476-4699-a64d-f79c5871f29a\") " pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 
17:07:41.518968 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.518937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q62vk\" (UniqueName: \"kubernetes.io/projected/a870cfb4-4476-4699-a64d-f79c5871f29a-kube-api-access-q62vk\") pod \"must-gather-9m6sd\" (UID: \"a870cfb4-4476-4699-a64d-f79c5871f29a\") " pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.519119 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.518994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a870cfb4-4476-4699-a64d-f79c5871f29a-must-gather-output\") pod \"must-gather-9m6sd\" (UID: \"a870cfb4-4476-4699-a64d-f79c5871f29a\") " pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.519299 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.519285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a870cfb4-4476-4699-a64d-f79c5871f29a-must-gather-output\") pod \"must-gather-9m6sd\" (UID: \"a870cfb4-4476-4699-a64d-f79c5871f29a\") " pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.526015 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.525994 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62vk\" (UniqueName: \"kubernetes.io/projected/a870cfb4-4476-4699-a64d-f79c5871f29a-kube-api-access-q62vk\") pod \"must-gather-9m6sd\" (UID: \"a870cfb4-4476-4699-a64d-f79c5871f29a\") " pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.619941 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.619908 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vwfl5/must-gather-9m6sd" Mar 18 17:07:41.739490 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:41.739432 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/must-gather-9m6sd"] Mar 18 17:07:42.140589 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:42.140554 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/must-gather-9m6sd" event={"ID":"a870cfb4-4476-4699-a64d-f79c5871f29a","Type":"ContainerStarted","Data":"371569d2effa8237e72ccf0da4d7b118a133530be9613a9b5b20728db5a245e4"} Mar 18 17:07:43.145930 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:43.145892 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/must-gather-9m6sd" event={"ID":"a870cfb4-4476-4699-a64d-f79c5871f29a","Type":"ContainerStarted","Data":"4f7d360ef1c8a1026dd41ec7f1ba50a07a7d09d523b979a718ef82073182b22b"} Mar 18 17:07:43.145930 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:43.145931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/must-gather-9m6sd" event={"ID":"a870cfb4-4476-4699-a64d-f79c5871f29a","Type":"ContainerStarted","Data":"4e7277e76d2edb561bcdb036b93d8cd3541f4d1233e8eadb913f0b5333e9270b"} Mar 18 17:07:43.162682 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:43.162633 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vwfl5/must-gather-9m6sd" podStartSLOduration=1.254449325 podStartE2EDuration="2.162618472s" podCreationTimestamp="2026-03-18 17:07:41 +0000 UTC" firstStartedPulling="2026-03-18 17:07:41.749474302 +0000 UTC m=+1431.620098469" lastFinishedPulling="2026-03-18 17:07:42.657643458 +0000 UTC m=+1432.528267616" observedRunningTime="2026-03-18 17:07:43.160243189 +0000 UTC m=+1433.030867365" watchObservedRunningTime="2026-03-18 17:07:43.162618472 +0000 UTC m=+1433.033242647" Mar 18 17:07:44.138358 ip-10-0-135-173 
kubenswrapper[2573]: I0318 17:07:44.138327 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-sfj8n_bd8bd61d-840f-4e63-af60-47dded682df3/global-pull-secret-syncer/0.log" Mar 18 17:07:44.214116 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:44.214073 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bl97m_53c4dcc2-c3ef-42fa-9b56-1162b7e8fbce/konnectivity-agent/0.log" Mar 18 17:07:44.305648 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:44.305609 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-173.ec2.internal_0b49064290dc49dad3af00f5cc25e86d/haproxy/0.log" Mar 18 17:07:46.140475 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.140431 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7lrgj/must-gather-hz757"] Mar 18 17:07:46.143372 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.143339 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7lrgj/must-gather-hz757"] Mar 18 17:07:46.143687 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.143652 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-7lrgj/must-gather-hz757" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="copy" containerID="cri-o://233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553" gracePeriod=2 Mar 18 17:07:46.145636 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.145609 2573 status_manager.go:895] "Failed to get status for pod" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" pod="openshift-must-gather-7lrgj/must-gather-hz757" err="pods \"must-gather-hz757\" is forbidden: User \"system:node:ip-10-0-135-173.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7lrgj\": no relationship found between node 'ip-10-0-135-173.ec2.internal' and this object" Mar 
Mar 18 17:07:46.548655 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.548556 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7lrgj_must-gather-hz757_38313260-b7df-4b9a-9a29-78a128b9a617/copy/0.log"
Mar 18 17:07:46.549568 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.549256 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lrgj/must-gather-hz757"
Mar 18 17:07:46.551648 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.551596 2573 status_manager.go:895] "Failed to get status for pod" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" pod="openshift-must-gather-7lrgj/must-gather-hz757" err="pods \"must-gather-hz757\" is forbidden: User \"system:node:ip-10-0-135-173.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-7lrgj\": no relationship found between node 'ip-10-0-135-173.ec2.internal' and this object"
Mar 18 17:07:46.671835 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.671795 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/38313260-b7df-4b9a-9a29-78a128b9a617-must-gather-output\") pod \"38313260-b7df-4b9a-9a29-78a128b9a617\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") "
Mar 18 17:07:46.672091 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.671858 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thzlt\" (UniqueName: \"kubernetes.io/projected/38313260-b7df-4b9a-9a29-78a128b9a617-kube-api-access-thzlt\") pod \"38313260-b7df-4b9a-9a29-78a128b9a617\" (UID: \"38313260-b7df-4b9a-9a29-78a128b9a617\") "
Mar 18 17:07:46.673515 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.673482 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38313260-b7df-4b9a-9a29-78a128b9a617-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "38313260-b7df-4b9a-9a29-78a128b9a617" (UID: "38313260-b7df-4b9a-9a29-78a128b9a617"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 17:07:46.682754 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.682447 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38313260-b7df-4b9a-9a29-78a128b9a617-kube-api-access-thzlt" (OuterVolumeSpecName: "kube-api-access-thzlt") pod "38313260-b7df-4b9a-9a29-78a128b9a617" (UID: "38313260-b7df-4b9a-9a29-78a128b9a617"). InnerVolumeSpecName "kube-api-access-thzlt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 17:07:46.716077 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.716034 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" path="/var/lib/kubelet/pods/38313260-b7df-4b9a-9a29-78a128b9a617/volumes"
Mar 18 17:07:46.775026 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.773105 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thzlt\" (UniqueName: \"kubernetes.io/projected/38313260-b7df-4b9a-9a29-78a128b9a617-kube-api-access-thzlt\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\""
Mar 18 17:07:46.775026 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:46.773154 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/38313260-b7df-4b9a-9a29-78a128b9a617-must-gather-output\") on node \"ip-10-0-135-173.ec2.internal\" DevicePath \"\""
Mar 18 17:07:47.166966 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.166942 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7lrgj_must-gather-hz757_38313260-b7df-4b9a-9a29-78a128b9a617/copy/0.log"
Mar 18 17:07:47.167928 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.167902 2573 generic.go:358] "Generic (PLEG): container finished" podID="38313260-b7df-4b9a-9a29-78a128b9a617" containerID="233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553" exitCode=143
Mar 18 17:07:47.168210 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.168200 2573 scope.go:117] "RemoveContainer" containerID="233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553"
Mar 18 17:07:47.168444 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.168432 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7lrgj/must-gather-hz757"
Mar 18 17:07:47.182409 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.182373 2573 scope.go:117] "RemoveContainer" containerID="7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc"
Mar 18 17:07:47.251984 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.251956 2573 scope.go:117] "RemoveContainer" containerID="233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553"
Mar 18 17:07:47.252471 ip-10-0-135-173 kubenswrapper[2573]: E0318 17:07:47.252367 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553\": container with ID starting with 233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553 not found: ID does not exist" containerID="233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553"
Mar 18 17:07:47.252598 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.252490 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553"} err="failed to get container status \"233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553\": rpc error: code = NotFound desc = could not find container \"233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553\": container with ID starting with 233fd750cc93c762233344ce6bbea7133bfcde0fff25ac93d7925b219dc3d553 not found: ID does not exist"
Mar 18 17:07:47.252598 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.252520 2573 scope.go:117] "RemoveContainer" containerID="7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc"
Mar 18 17:07:47.252840 ip-10-0-135-173 kubenswrapper[2573]: E0318 17:07:47.252814 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc\": container with ID starting with 7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc not found: ID does not exist" containerID="7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc"
Mar 18 17:07:47.252901 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.252848 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc"} err="failed to get container status \"7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc\": rpc error: code = NotFound desc = could not find container \"7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc\": container with ID starting with 7f12eca67145776220b890501be1658bb390691d2407847ae1e55a6a4f300ecc not found: ID does not exist"
Mar 18 17:07:47.847908 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.847886 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/alertmanager/0.log"
Mar 18 17:07:47.869791 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.869761 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/config-reloader/0.log"
Mar 18 17:07:47.891818 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.891784 2573 log.go:25] "Finished parsing log
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/kube-rbac-proxy-web/0.log"
Mar 18 17:07:47.916465 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.916439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/kube-rbac-proxy/0.log"
Mar 18 17:07:47.940908 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.940876 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/kube-rbac-proxy-metric/0.log"
Mar 18 17:07:47.964665 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.964638 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/prom-label-proxy/0.log"
Mar 18 17:07:47.991326 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:47.991291 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_742e1fd3-1ae2-4fe1-90f1-31ac5075418f/init-config-reloader/0.log"
Mar 18 17:07:48.066450 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.066416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-8pcdj_f1c58dd1-8573-4833-a0b4-fc571c8853cd/kube-state-metrics/0.log"
Mar 18 17:07:48.084658 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.084604 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-8pcdj_f1c58dd1-8573-4833-a0b4-fc571c8853cd/kube-rbac-proxy-main/0.log"
Mar 18 17:07:48.104773 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.104689 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-8pcdj_f1c58dd1-8573-4833-a0b4-fc571c8853cd/kube-rbac-proxy-self/0.log"
Mar 18 17:07:48.133548 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.133515 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5f4d67fd7-9p74k_f23cb088-2776-4ccb-a0bb-7b7ae3587e23/metrics-server/0.log"
Mar 18 17:07:48.161449 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.161416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6d47bdb78d-l2qp5_77b54a08-71b5-4551-bbc4-dda4f8381c0a/monitoring-plugin/0.log"
Mar 18 17:07:48.266261 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.266219 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5jfgb_8b4e9753-8c03-4237-855f-f207bb536cc9/node-exporter/0.log"
Mar 18 17:07:48.288358 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.288332 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5jfgb_8b4e9753-8c03-4237-855f-f207bb536cc9/kube-rbac-proxy/0.log"
Mar 18 17:07:48.310190 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.310162 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5jfgb_8b4e9753-8c03-4237-855f-f207bb536cc9/init-textfile/0.log"
Mar 18 17:07:48.416808 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.416709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-zhn6r_0121a6ed-4a8e-408b-a0c5-f8ed07b6656c/kube-rbac-proxy-main/0.log"
Mar 18 17:07:48.436461 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.436363 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-zhn6r_0121a6ed-4a8e-408b-a0c5-f8ed07b6656c/kube-rbac-proxy-self/0.log"
Mar 18 17:07:48.459416 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.459359 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-zhn6r_0121a6ed-4a8e-408b-a0c5-f8ed07b6656c/openshift-state-metrics/0.log"
Mar 18 17:07:48.495603 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.495577 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/prometheus/0.log"
Mar 18 17:07:48.514193 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.514161 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/config-reloader/0.log"
Mar 18 17:07:48.532834 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.532805 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/thanos-sidecar/0.log"
Mar 18 17:07:48.551547 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.551518 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/kube-rbac-proxy-web/0.log"
Mar 18 17:07:48.575887 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.575857 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/kube-rbac-proxy/0.log"
Mar 18 17:07:48.594792 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.594754 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/kube-rbac-proxy-thanos/0.log"
Mar 18 17:07:48.614369 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.614311 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e923bbe4-b772-44cf-8535-fb84dcbc15c0/init-config-reloader/0.log"
Mar 18 17:07:48.719111 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.719079 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68dd9f748c-blfgv_6f491edb-828b-4cf1-b59c-520b9c8b8073/telemeter-client/0.log"
Mar 18 17:07:48.739201 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.739174 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68dd9f748c-blfgv_6f491edb-828b-4cf1-b59c-520b9c8b8073/reload/0.log"
Mar 18 17:07:48.773158 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.773133 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68dd9f748c-blfgv_6f491edb-828b-4cf1-b59c-520b9c8b8073/kube-rbac-proxy/0.log"
Mar 18 17:07:48.810685 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.810662 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-548bb99f76-vtpmv_b144d91d-2877-4f0b-b92b-f76f99b00d41/thanos-query/0.log"
Mar 18 17:07:48.832223 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.832195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-548bb99f76-vtpmv_b144d91d-2877-4f0b-b92b-f76f99b00d41/kube-rbac-proxy-web/0.log"
Mar 18 17:07:48.852547 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.852515 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-548bb99f76-vtpmv_b144d91d-2877-4f0b-b92b-f76f99b00d41/kube-rbac-proxy/0.log"
Mar 18 17:07:48.871750 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.871726 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-548bb99f76-vtpmv_b144d91d-2877-4f0b-b92b-f76f99b00d41/prom-label-proxy/0.log"
Mar 18 17:07:48.891926 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.891888 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-548bb99f76-vtpmv_b144d91d-2877-4f0b-b92b-f76f99b00d41/kube-rbac-proxy-rules/0.log"
Mar 18 17:07:48.913223 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:48.913198 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-548bb99f76-vtpmv_b144d91d-2877-4f0b-b92b-f76f99b00d41/kube-rbac-proxy-metrics/0.log"
Mar 18 17:07:50.917070 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:50.917037 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fb95c978c-gxmqd_02f06747-157a-4688-8e56-f6c8837b2273/console/0.log"
Mar 18 17:07:51.342503 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342470 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"]
Mar 18 17:07:51.342844 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342832 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="gather"
Mar 18 17:07:51.342888 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342847 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="gather"
Mar 18 17:07:51.342888 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342858 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="copy"
Mar 18 17:07:51.342888 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342863 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="copy"
Mar 18 17:07:51.342989 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342919 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="gather"
Mar 18 17:07:51.342989 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.342930 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="38313260-b7df-4b9a-9a29-78a128b9a617" containerName="copy"
Mar 18 17:07:51.347516 ip-10-0-135-173
kubenswrapper[2573]: I0318 17:07:51.347496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.356078 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.356057 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"]
Mar 18 17:07:51.425791 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.425753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-proc\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.426061 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.426037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89xhq\" (UniqueName: \"kubernetes.io/projected/500ab18a-93a4-48dd-ab1c-59b2780ede89-kube-api-access-89xhq\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.426202 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.426186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-sys\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.426331 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.426315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-lib-modules\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.426455 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.426439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-podres\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527777 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527734 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-proc\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527936 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89xhq\" (UniqueName: \"kubernetes.io/projected/500ab18a-93a4-48dd-ab1c-59b2780ede89-kube-api-access-89xhq\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527936 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-sys\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527936 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-lib-modules\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527936 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-podres\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527936 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-proc\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.527936 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527925 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-sys\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.528131 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-lib-modules\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.528131 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.527973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/500ab18a-93a4-48dd-ab1c-59b2780ede89-podres\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.536851 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.535852 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89xhq\" (UniqueName: \"kubernetes.io/projected/500ab18a-93a4-48dd-ab1c-59b2780ede89-kube-api-access-89xhq\") pod \"perf-node-gather-daemonset-9bqnn\" (UID: \"500ab18a-93a4-48dd-ab1c-59b2780ede89\") " pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.659454 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.659344 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:51.800922 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.800895 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"]
Mar 18 17:07:51.803191 ip-10-0-135-173 kubenswrapper[2573]: W0318 17:07:51.803160 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod500ab18a_93a4_48dd_ab1c_59b2780ede89.slice/crio-67ecb2c79318ad644f54c601ba2b749080ff374ee8c9a9e35f028652f5010de9 WatchSource:0}: Error finding container 67ecb2c79318ad644f54c601ba2b749080ff374ee8c9a9e35f028652f5010de9: Status 404 returned error can't find the container with id 67ecb2c79318ad644f54c601ba2b749080ff374ee8c9a9e35f028652f5010de9
Mar 18 17:07:51.995233 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:51.995159 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8bzfj_7290ec47-5651-45a6-b07b-cea131daf413/dns/0.log"
Mar 18 17:07:52.013566 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.013541 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8bzfj_7290ec47-5651-45a6-b07b-cea131daf413/kube-rbac-proxy/0.log"
Mar 18 17:07:52.116598 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.116570 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5nrz9_8f0f214c-5e39-41f2-90e1-683e89ac4db2/dns-node-resolver/0.log"
Mar 18 17:07:52.190811 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.190774 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn" event={"ID":"500ab18a-93a4-48dd-ab1c-59b2780ede89","Type":"ContainerStarted","Data":"818fb0049eef070d80ddc7c3f47076958e2a31e56850eaa291b8e3c3e59b83ea"}
Mar 18 17:07:52.190811 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.190818 2573 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn" event={"ID":"500ab18a-93a4-48dd-ab1c-59b2780ede89","Type":"ContainerStarted","Data":"67ecb2c79318ad644f54c601ba2b749080ff374ee8c9a9e35f028652f5010de9"}
Mar 18 17:07:52.191085 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.190938 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:07:52.204854 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.204801 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn" podStartSLOduration=1.204784127 podStartE2EDuration="1.204784127s" podCreationTimestamp="2026-03-18 17:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:07:52.20445015 +0000 UTC m=+1442.075074327" watchObservedRunningTime="2026-03-18 17:07:52.204784127 +0000 UTC m=+1442.075408304"
Mar 18 17:07:52.614851 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:52.614825 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wgm8d_6e6826d5-ecdc-4d2e-97c7-fbe508364d90/node-ca/0.log"
Mar 18 17:07:53.628229 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:53.628204 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6wv4w_459ce05a-5191-4973-af7e-9b892245fcdc/serve-healthcheck-canary/0.log"
Mar 18 17:07:54.106114 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:54.106087 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-94z6k_8adbd2f9-bd9e-49c8-b447-5e8d4825c56b/kube-rbac-proxy/0.log"
Mar 18 17:07:54.124486 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:54.124457 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-94z6k_8adbd2f9-bd9e-49c8-b447-5e8d4825c56b/exporter/0.log"
Mar 18 17:07:54.143670 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:54.143643 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-94z6k_8adbd2f9-bd9e-49c8-b447-5e8d4825c56b/extractor/0.log"
Mar 18 17:07:56.043915 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:56.043885 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-69d7c9bbdc-8g4s5_bd41a339-2609-4ede-ba5e-c5feefadb0ed/manager/0.log"
Mar 18 17:07:56.198402 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:56.198365 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-gdvxs_91a770c1-5a7c-4885-86c2-99635d5626a8/s3-init/0.log"
Mar 18 17:07:56.222920 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:56.222890 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-748c497bc-9fhb8_ab5793e7-2304-4e54-ac32-abc53eb88fa7/seaweedfs/0.log"
Mar 18 17:07:58.205708 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:07:58.205680 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vwfl5/perf-node-gather-daemonset-9bqnn"
Mar 18 17:08:01.101861 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.101818 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4khsw_19928fae-37e8-4123-9e56-7cc4713544ee/kube-multus/0.log"
Mar 18 17:08:01.130378 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.130353 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/kube-multus-additional-cni-plugins/0.log"
Mar 18 17:08:01.150369 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.150302 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/egress-router-binary-copy/0.log"
Mar 18 17:08:01.180354 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.180318 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/cni-plugins/0.log"
Mar 18 17:08:01.200251 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.200200 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/bond-cni-plugin/0.log"
Mar 18 17:08:01.224220 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.224191 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/routeoverride-cni/0.log"
Mar 18 17:08:01.244004 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.243977 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/whereabouts-cni-bincopy/0.log"
Mar 18 17:08:01.264258 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.264225 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-82l6k_4bd66593-b926-4351-809d-710aff145026/whereabouts-cni/0.log"
Mar 18 17:08:01.638810 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.638781 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-97pm9_f9982fef-c82a-4b1f-8622-337551d7ec32/network-metrics-daemon/0.log"
Mar 18 17:08:01.655505 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:01.655478 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-97pm9_f9982fef-c82a-4b1f-8622-337551d7ec32/kube-rbac-proxy/0.log"
Mar 18 17:08:02.520922 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.520846 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/ovn-controller/0.log"
Mar 18 17:08:02.544149 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.544118 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/ovn-acl-logging/0.log"
Mar 18 17:08:02.562082 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.562048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/kube-rbac-proxy-node/0.log"
Mar 18 17:08:02.589095 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.589054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 17:08:02.617252 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.617210 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/northd/0.log"
Mar 18 17:08:02.643851 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.643824 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/nbdb/0.log"
Mar 18 17:08:02.671992 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.671967 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/sbdb/0.log"
Mar 18 17:08:02.772722 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:02.772650 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfhr_988c8f30-310e-4643-bf4c-f424b7d7c8ce/ovnkube-controller/0.log"
Mar 18 17:08:04.526132 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:04.526105 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-l99p6_c2cd13a8-578c-4371-b5e8-7ef2af59364b/network-check-target-container/0.log"
Mar 18 17:08:05.560332 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:05.560305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6dkdp_48146439-50bf-4967-ae3b-86c4c7ea0c9d/iptables-alerter/0.log"
Mar 18 17:08:06.269915 ip-10-0-135-173 kubenswrapper[2573]: I0318 17:08:06.269887 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-s92ms_b084c4bf-7986-4b9a-8c58-c374ef1fcf78/tuned/0.log"