Apr 22 19:57:58.366582 ip-10-0-131-194 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:57:58.791472 ip-10-0-131-194 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:58.791472 ip-10-0-131-194 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:57:58.791472 ip-10-0-131-194 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:58.791472 ip-10-0-131-194 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:57:58.791472 ip-10-0-131-194 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
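The deprecation warnings above all point at the same fix: move these command-line flags into the file passed via --config. As a minimal sketch only (the real settings on this node live in /etc/kubernetes/kubelet.conf, and every value below is an illustrative assumption, not read from this log), the flagged options map onto KubeletConfiguration fields roughly like this:

```yaml
# Hypothetical KubeletConfiguration fragment; field names are from the
# kubelet.config.k8s.io/v1beta1 schema, values are placeholders.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"  # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # replaces --volume-plugin-dir
systemReserved:          # replaces --system-reserved
  cpu: "500m"
  memory: "1Gi"
evictionHard:            # eviction settings replace --minimum-container-ttl-duration
  memory.available: "100Mi"
```

There is no config-file equivalent for --pod-infra-container-image; per the warning, the sandbox (pause) image is taken from the container runtime's own configuration instead.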
Apr 22 19:57:58.792284 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.792171 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:57:58.794448 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794433 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:58.794448 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794447 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794451 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794455 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794458 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794461 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794463 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794466 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794470 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794474 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794478 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794480 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794483 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794486 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794488 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794490 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794500 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794503 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794506 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794509 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:58.794508 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794511 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794514 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794517 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794520 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794523 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794525 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794528 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794531 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794533 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794536 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794538 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794540 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794543 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794545 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794548 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794550 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794553 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794555 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794557 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794560 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:58.794963 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794563 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794566 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794568 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794570 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794573 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794575 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794579 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794583 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794586 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794589 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794591 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794593 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794596 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794598 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794602 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794605 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794607 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794610 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794612 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:58.795510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794614 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794617 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794620 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794622 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794624 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794627 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794629 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794631 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794634 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794636 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794640 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794642 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794645 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794647 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794650 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794652 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794654 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794657 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794659 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794662 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:58.795996 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794664 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794667 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794669 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794672 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794674 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794677 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.794679 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795041 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795046 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795049 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795051 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795054 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795057 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795059 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795062 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795065 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795067 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795070 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795072 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795074 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:58.796498 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795077 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795080 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795082 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795085 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795087 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795090 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795093 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
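The two warned-about gates that the kubelet does recognize (the deprecated KMSv1 and the GA ServiceAccountTokenNodeBinding) are being passed in explicitly; the long list of "unrecognized" names are OpenShift cluster-level feature gates, which the kubelet's Kubernetes gate registry simply does not know and ignores. As a hypothetical sketch of how the recognized gates would appear in a kubelet config file (the gate values below mirror this log; the file layout is an assumption, not read from this node):

```yaml
# Hypothetical fragment of a KubeletConfiguration (kubelet.config.k8s.io/v1beta1).
# Only gates registered in the kubelet's own Kubernetes gate set belong here;
# OpenShift-level gates like ManagedBootImages are managed via the cluster's
# FeatureGate resource instead and would be "unrecognized" by the kubelet.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  KMSv1: true                           # deprecated, per the feature_gate.go:349 warning
  ServiceAccountTokenNodeBinding: true  # GA, per the feature_gate.go:351 warning
```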
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795096 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795098 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795101 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795103 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795106 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795110 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795113 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795116 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795119 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795122 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795125 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795128 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:58.796979 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795132 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795134 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795137 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795140 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795143 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795145 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795149 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795151 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795154 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795157 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795159 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795162 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795164 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795166 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795169 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795171 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795174 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795176 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795178 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795181 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:58.797502 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795183 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795185 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795187 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795190 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795193 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795195 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795197 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795200 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795202 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795205 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795207 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795209 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795213 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795216 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795219 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795221 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795224 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795226 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795229 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795231 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:58.798032 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795233 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795236 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795238 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795241 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795243 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795245 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795262 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795266 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795269 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795271 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795274 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795276 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795278 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795281 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795352 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795360 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795366 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795370 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795375 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795378 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795382 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795387 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:57:58.798548 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795391 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795394 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795402 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795406 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795409 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795412 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795415 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795418 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795421 2580 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795423 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795426 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795430 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795432 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795436 2580 flags.go:64] FLAG: --config-dir=""
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795438 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795441 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795445 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795448 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795452 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795455 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795458 2580 flags.go:64] FLAG: --contention-profiling="false" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795461 2580 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795464 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795467 2580 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795470 2580 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 19:57:58.799081 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795474 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795477 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795480 2580 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795482 2580 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795485 2580 flags.go:64] FLAG: --enable-server="true" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795488 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 19:57:58.799749 
ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795492 2580 flags.go:64] FLAG: --event-burst="100" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795496 2580 flags.go:64] FLAG: --event-qps="50" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795499 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795503 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795506 2580 flags.go:64] FLAG: --eviction-hard="" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795510 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795513 2580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795516 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795519 2580 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795522 2580 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795525 2580 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795529 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795531 2580 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795534 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795537 2580 flags.go:64] 
FLAG: --fail-swap-on="true" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795540 2580 flags.go:64] FLAG: --feature-gates="" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795544 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795547 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795551 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:57:58.799749 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795554 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795557 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795560 2580 flags.go:64] FLAG: --help="false" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795563 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-131-194.ec2.internal" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795566 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795569 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795571 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795575 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795578 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: 
I0422 19:57:58.795581 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795583 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795586 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795589 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795592 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795595 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795598 2580 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795602 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795604 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795607 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795610 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795613 2580 flags.go:64] FLAG: --lock-file="" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795615 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795618 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795621 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 
19:57:58.800380 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795626 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795629 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795631 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795634 2580 flags.go:64] FLAG: --logging-format="text" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795637 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795640 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795643 2580 flags.go:64] FLAG: --manifest-url="" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795648 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795661 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795664 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795668 2580 flags.go:64] FLAG: --max-pods="110" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795671 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795674 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795677 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 
19:57:58.795679 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795682 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795685 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795687 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795699 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795702 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795705 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795708 2580 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795711 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:57:58.800965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795716 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795718 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795722 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795725 2580 flags.go:64] FLAG: --port="10250" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795728 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:57:58.801630 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:57:58.795731 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e558407d4d17dccd" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795734 2580 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795737 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795740 2580 flags.go:64] FLAG: --register-node="true" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795743 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795746 2580 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795750 2580 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795752 2580 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795755 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795758 2580 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795762 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795766 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795769 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795772 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795774 2580 flags.go:64] FLAG: --runonce="false" Apr 22 19:57:58.801630 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:57:58.795777 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795780 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795784 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795787 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795789 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795792 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:57:58.801630 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795795 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795798 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795801 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795803 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795806 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795809 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795812 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795815 2580 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795818 2580 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795824 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795826 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795829 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795832 2580 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795835 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795838 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795841 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795843 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795846 2580 flags.go:64] FLAG: --v="2" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795850 2580 flags.go:64] FLAG: --version="false" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795854 2580 flags.go:64] FLAG: --vmodule="" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795858 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.795862 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795967 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:58.802561 
ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795970 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:58.802561 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795973 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795976 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795978 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795981 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795984 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795986 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795989 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795991 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795994 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795996 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.795999 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796001 2580 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796004 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796006 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796009 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796012 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796014 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796017 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796020 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796022 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:58.803385 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796025 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796028 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796030 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796032 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796035 2580 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796038 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796040 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796042 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796045 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796047 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796050 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796053 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796055 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796058 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796060 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796063 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796065 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:58.804002 
ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796067 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796070 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796072 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:58.804002 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796074 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796076 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796079 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796082 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796084 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796087 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796089 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796093 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796097 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796100 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796102 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796105 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796108 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796110 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796113 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796115 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796117 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796120 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796122 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:58.804586 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796125 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796127 2580 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796129 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796133 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796135 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796138 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796140 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796143 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796145 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796148 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796150 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796153 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796155 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796157 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:58.805128 ip-10-0-131-194 
kubenswrapper[2580]: W0422 19:57:58.796160 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796162 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796164 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796167 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796170 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796172 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:58.805128 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796175 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796179 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796182 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796186 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.796188 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.796200 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.804280 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.804396 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804452 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804457 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804461 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804464 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804467 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804470 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804473 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:58.805698 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804476 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804479 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804481 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804484 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804487 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804490 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804492 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804495 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804497 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804500 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804502 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804504 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804507 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804509 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804512 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804515 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804517 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804520 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804523 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804525 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:58.806101 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804528 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804530 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804534 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804537 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804541 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804544 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804547 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804549 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804552 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804555 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804557 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804560 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804562 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804565 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804567 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804570 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804572 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804575 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804577 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:58.806623 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804580 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804582 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804585 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804588 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804591 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804595 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804599 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804602 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804605 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804608 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804611 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804614 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804616 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804619 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804622 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804625 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804627 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804631 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804634 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:58.807077 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804636 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804639 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804642 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804644 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804647 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804649 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804652 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804654 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804656 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804659 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804662 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804665 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804667 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804670 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804672 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804674 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804677 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804679 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804682 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804684 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:58.807564 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804687 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.804692 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804788 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804793 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804796 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804799 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804801 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804804 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804806 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804809 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804811 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804814 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804817 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804819 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804822 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804824 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:58.808089 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804827 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804829 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804832 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804834 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804837 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804839 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804843 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804845 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804848 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804851 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804853 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804855 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804858 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804860 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804863 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804865 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804867 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804870 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804872 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:58.808616 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804875 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804877 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804879 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804882 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804884 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804886 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804889 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804892 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804894 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804897 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804899 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804902 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804904 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804907 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804910 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804913 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804915 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804917 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804920 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804922 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:58.809093 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804925 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804927 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804930 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804932 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804935 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804937 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804940 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804942 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804944 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804947 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804949 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804951 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804954 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804956 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804959 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804961 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804964 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804966 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804970 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804974 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:58.809688 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804977 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804980 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804983 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804986 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804990 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804993 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804996 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.804999 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.805001 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.805004 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.805006 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.805009 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:58.805012 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.805016 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:58.810170 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.805812 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:57:58.811263 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.811234 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:57:58.812179 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.812168 2580 server.go:1019] "Starting client certificate rotation"
Apr 22 19:57:58.812295 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.812277 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:57:58.812333 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.812325 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:57:58.836422 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.836396 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:57:58.842204 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.842180 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:57:58.858473 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.858450 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:57:58.864379 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.864360 2580 log.go:25] "Validated CRI v1 image API"
Apr 22 19:57:58.865802 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.865781 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:57:58.868786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.868767 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:57:58.872428 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.872406 2580 fs.go:135] Filesystem UUIDs: map[15c14eb5-1cd3-4a17-9ce3-6b6d1a0a2e37:/dev/nvme0n1p4 575663ef-3331-4582-9395-6b43677b4f60:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 19:57:58.872499 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.872426 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:57:58.880222 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.880102 2580 manager.go:217] Machine: {Timestamp:2026-04-22 19:57:58.878003284 +0000 UTC m=+0.396896355 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3091925 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2541d67622d8bd4f44e8af61ce44cf SystemUUID:ec2541d6-7622-d8bd-4f44-e8af61ce44cf BootID:39b6f423-d0b4-4fc4-9b6e-6923feed6245 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:60:bc:07:07:37 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:60:bc:07:07:37 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:fe:c6:f0:58:1b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:57:58.880222 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.880210 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:57:58.880378 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.880321 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:57:58.881414 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.881380 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:57:58.881605 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.881417 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-194.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 19:57:58.881652 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.881615 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 19:57:58.881652 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.881624 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 19:57:58.881652 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.881637
2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:58.882309 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.882298 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:58.883587 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.883577 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:58.883737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.883725 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:57:58.883839 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.883824 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hsc4b" Apr 22 19:57:58.885938 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.885928 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:57:58.886440 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.886430 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:57:58.886471 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.886454 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:57:58.886471 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.886466 2580 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:57:58.886525 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.886477 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:57:58.887559 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.887547 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:58.887672 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.887566 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:58.890591 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:57:58.890572 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:57:58.891147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.891132 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hsc4b" Apr 22 19:57:58.892041 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.892027 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:57:58.893702 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893678 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893708 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893724 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893736 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893746 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893754 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893762 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:57:58.893801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893768 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:57:58.893801 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:57:58.893777 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:57:58.894046 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893877 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:57:58.894046 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893901 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:57:58.894046 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.893910 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:57:58.894698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.894688 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:57:58.894698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.894698 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:57:58.898887 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.898873 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:57:58.898939 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.898930 2580 server.go:1295] "Started kubelet" Apr 22 19:57:58.899812 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.899714 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:57:58.899899 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.899808 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:57:58.899899 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.899882 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:57:58.900228 ip-10-0-131-194 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:57:58.900507 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.900480 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:58.901568 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.901548 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:57:58.903344 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.902897 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:57:58.904173 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.904144 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-194.ec2.internal" not found Apr 22 19:57:58.905139 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.905119 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:58.907698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.907636 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:57:58.907698 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.907665 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908296 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908330 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908336 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908533 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908547 2580 reconciler.go:26] "Reconciler: start to sync 
state" Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908608 2580 factory.go:55] Registering systemd factory Apr 22 19:57:58.908835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.908627 2580 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:57:58.909290 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.909274 2580 factory.go:153] Registering CRI-O factory Apr 22 19:57:58.909351 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.909295 2580 factory.go:223] Registration of the crio container factory successfully Apr 22 19:57:58.909351 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:58.909228 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-194.ec2.internal\" not found" Apr 22 19:57:58.909452 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.909362 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:57:58.909452 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.909400 2580 factory.go:103] Registering Raw factory Apr 22 19:57:58.909538 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.909459 2580 manager.go:1196] Started watching for new ooms in manager Apr 22 19:57:58.910939 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.910918 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:58.911047 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.911024 2580 manager.go:319] Starting recovery of all containers Apr 22 19:57:58.911612 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:58.911594 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:57:58.913206 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:58.913025 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-194.ec2.internal\" not found" node="ip-10-0-131-194.ec2.internal" Apr 22 19:57:58.918562 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.918542 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-194.ec2.internal" not found Apr 22 19:57:58.921739 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.921720 2580 manager.go:324] Recovery completed Apr 22 19:57:58.926236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.926224 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:58.928988 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.928974 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:58.929058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.929001 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:58.929058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.929010 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:58.929500 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.929488 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:57:58.929587 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.929501 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:57:58.929587 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.929529 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:58.932043 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:57:58.932028 2580 policy_none.go:49] "None policy: Start" Apr 22 19:57:58.932111 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.932047 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:57:58.932111 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.932060 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:57:58.974423 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.974398 2580 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-194.ec2.internal" not found Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.979752 2580 manager.go:341] "Starting Device Plugin manager" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:58.979779 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.979789 2580 server.go:85] "Starting device plugin registration server" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.980050 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.980064 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.980186 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.980288 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:58.980298 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:58.981036 2580 
eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:57:58.994936 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:58.981088 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-194.ec2.internal\" not found" Apr 22 19:57:59.062149 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.062072 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:57:59.063552 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.063533 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:57:59.063644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.063563 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:57:59.063644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.063610 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:57:59.063644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.063618 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:57:59.063978 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:59.063660 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:57:59.065737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.065714 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:59.080561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.080534 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:59.081526 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.081512 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:59.081583 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.081542 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:59.081583 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.081557 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:59.081583 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.081580 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.087485 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.087471 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.087529 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:59.087495 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-194.ec2.internal\": node \"ip-10-0-131-194.ec2.internal\" not found" Apr 22 
19:57:59.164381 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.164320 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal"] Apr 22 19:57:59.167262 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.167235 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.167383 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.167238 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.183828 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.183805 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.187237 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.187219 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.197722 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.197706 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:59.207172 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.207156 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:59.309828 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.309800 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/004125ff8da455671b80f86ae1638e89-etc-kube\") 
pod \"kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal\" (UID: \"004125ff8da455671b80f86ae1638e89\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.309828 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.309829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/004125ff8da455671b80f86ae1638e89-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal\" (UID: \"004125ff8da455671b80f86ae1638e89\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.310005 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.309851 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d94f1ae0823126e44c89a34cb4f19534-config\") pod \"kube-apiserver-proxy-ip-10-0-131-194.ec2.internal\" (UID: \"d94f1ae0823126e44c89a34cb4f19534\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.410464 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.410396 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d94f1ae0823126e44c89a34cb4f19534-config\") pod \"kube-apiserver-proxy-ip-10-0-131-194.ec2.internal\" (UID: \"d94f1ae0823126e44c89a34cb4f19534\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.410464 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.410427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/004125ff8da455671b80f86ae1638e89-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal\" (UID: \"004125ff8da455671b80f86ae1638e89\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.410464 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.410446 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/004125ff8da455671b80f86ae1638e89-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal\" (UID: \"004125ff8da455671b80f86ae1638e89\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.410644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.410513 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d94f1ae0823126e44c89a34cb4f19534-config\") pod \"kube-apiserver-proxy-ip-10-0-131-194.ec2.internal\" (UID: \"d94f1ae0823126e44c89a34cb4f19534\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.410644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.410519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/004125ff8da455671b80f86ae1638e89-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal\" (UID: \"004125ff8da455671b80f86ae1638e89\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.410644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.410525 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/004125ff8da455671b80f86ae1638e89-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal\" (UID: \"004125ff8da455671b80f86ae1638e89\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.500659 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.500615 2580 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.509524 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.509494 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" Apr 22 19:57:59.812345 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.812307 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:57:59.813056 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.812472 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:57:59.813056 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.812509 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:57:59.813056 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.812529 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:57:59.887353 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.887320 2580 apiserver.go:52] "Watching apiserver" Apr 22 19:57:59.893266 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.893229 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" 
expiration="2028-04-21 19:52:58 +0000 UTC" deadline="2027-09-17 18:29:25.149249758 +0000 UTC" Apr 22 19:57:59.893336 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.893265 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12310h31m25.255986807s" Apr 22 19:57:59.894660 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.894642 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:57:59.896012 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.895987 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-6gkjl","openshift-image-registry/node-ca-46mjj","openshift-network-diagnostics/network-check-target-l4pll","openshift-network-operator/iptables-alerter-z79rl","kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal","openshift-multus/multus-additional-cni-plugins-rsqjt","openshift-multus/multus-bmfgl","openshift-multus/network-metrics-daemon-fjgnl","openshift-ovn-kubernetes/ovnkube-node-wxp4m","kube-system/konnectivity-agent-q4kgc","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"] Apr 22 19:57:59.899768 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.899548 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:57:59.900673 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.900652 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:57:59.902050 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902031 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:59.902138 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902094 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll"
Apr 22 19:57:59.902219 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:59.902173 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa"
Apr 22 19:57:59.902219 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902035 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:59.902219 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902028 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8kl9w\""
Apr 22 19:57:59.902352 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902188 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:57:59.902901 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902878 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:57:59.902901 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.902897 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:57:59.903045 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.903022 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:57:59.903045 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.903032 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qqmvh\""
Apr 22 19:57:59.903868 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.903848 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.904503 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.904474 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:57:59.904725 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.904703 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7bzm\""
Apr 22 19:57:59.905593 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.905180 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:59.905702 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.905612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:59.905878 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.905856 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:57:59.906598 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.906579 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:57:59.906674 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.906639 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:57:59.907516 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.907170 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:57:59.907516 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.907332 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.907516 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.907458 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:57:59.907774 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.907755 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:57:59.907913 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.907891 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cx7zt\""
Apr 22 19:57:59.909604 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.909588 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:57:59.909687 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:57:59.909671 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321"
Apr 22 19:57:59.909771 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.909754 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.909895 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.909882 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mflnj\""
Apr 22 19:57:59.910516 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.910500 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:57:59.911190 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.911178 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q4kgc"
Apr 22 19:57:59.912350 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912332 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:57:59.912445 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912384 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-var-lib-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.912445 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912417 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-log-socket\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.912555 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t95bv\" (UniqueName: \"kubernetes.io/projected/1296444e-df43-4223-beb7-c3de3946d7a7-kube-api-access-t95bv\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:57:59.912555 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912467 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-modprobe-d\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.912555 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqr9d\" (UniqueName: \"kubernetes.io/projected/b1bb13ae-9672-47e1-89b9-7a095040d199-kube-api-access-dqr9d\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:57:59.912555 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912515 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.912555 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912536 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-cni-multus\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-kubelet\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912611 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-sys\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912635 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912651 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912636 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-systemd-units\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-ovnkube-script-lib\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912743 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912749 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qb8\" (UniqueName: \"kubernetes.io/projected/13033536-961c-41e0-a8b1-73ef9eb5c983-kube-api-access-q4qb8\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.912786 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912756 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912800 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2fhf\" (UniqueName: \"kubernetes.io/projected/41c7e8c9-1a30-478c-8609-17a08d4db06c-kube-api-access-s2fhf\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912811 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jmxb4\""
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-cni-bin\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-etc-kubernetes\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912890 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-host\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912915 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13033536-961c-41e0-a8b1-73ef9eb5c983-ovn-node-metrics-cert\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-systemd\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.912981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-env-overrides\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1296444e-df43-4223-beb7-c3de3946d7a7-serviceca\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913035 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-cnibin\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913052 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-daemon-config\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913066 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysctl-d\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913081 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913094 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-slash\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913109 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-socket-dir-parent\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.913231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-node-log\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913171 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-ovnkube-config\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913193 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-netns\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-run\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913240 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znssr\" (UniqueName: \"kubernetes.io/projected/ece6b521-94c5-4509-8a90-439f6a926c6b-kube-api-access-znssr\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913281 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5k9\" (UniqueName: \"kubernetes.io/projected/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-kube-api-access-rv5k9\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-run-netns\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913348 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-ovn\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-run-ovn-kubernetes\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-cnibin\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-system-cni-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-cni-binary-copy\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913462 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysconfig\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1bb13ae-9672-47e1-89b9-7a095040d199-host-slash\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913500 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-etc-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914058 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913537 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-cni-bin\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913594 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1296444e-df43-4223-beb7-c3de3946d7a7-host\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913621 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-os-release\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913639 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-multus-certs\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913663 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysctl-conf\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913681 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913694 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-cni-netd\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913727 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-os-release\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913761 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-var-lib-kubelet\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913784 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913807 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-cni-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-kubernetes\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913856 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-systemd\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-kubelet\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913921 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-system-cni-dir\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-conf-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ece6b521-94c5-4509-8a90-439f6a926c6b-tmp\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.914706 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.913991 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914009 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914037 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-cni-binary-copy\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914095 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-lib-modules\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914120 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-tuned\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914126 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914147 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-k8s-cni-cncf-io\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914178 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-hostroot\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914210 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xh49\" (UniqueName: \"kubernetes.io/projected/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-kube-api-access-7xh49\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1bb13ae-9672-47e1-89b9-7a095040d199-iptables-alerter-script\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:57:59.915470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.914348 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4pxgt\""
Apr 22 19:57:59.915865
ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.915847 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:57:59.916071 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.916051 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:57:59.916146 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.916091 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-f8gcl\"" Apr 22 19:57:59.916146 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.916099 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:57:59.922865 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.922846 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:59.947008 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.946988 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-85bmd" Apr 22 19:57:59.954638 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.954619 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-85bmd" Apr 22 19:57:59.971579 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:59.971549 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod004125ff8da455671b80f86ae1638e89.slice/crio-7b9584dedef664c0e46236b1d894ca615f01d2bbc4b013f2cc0b275b075b5026 WatchSource:0}: Error finding container 
7b9584dedef664c0e46236b1d894ca615f01d2bbc4b013f2cc0b275b075b5026: Status 404 returned error can't find the container with id 7b9584dedef664c0e46236b1d894ca615f01d2bbc4b013f2cc0b275b075b5026 Apr 22 19:57:59.971788 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:57:59.971767 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd94f1ae0823126e44c89a34cb4f19534.slice/crio-424c451e8cba69cb99e116dad6d386a619444b371e2dcd7ac73a05cd7777c528 WatchSource:0}: Error finding container 424c451e8cba69cb99e116dad6d386a619444b371e2dcd7ac73a05cd7777c528: Status 404 returned error can't find the container with id 424c451e8cba69cb99e116dad6d386a619444b371e2dcd7ac73a05cd7777c528 Apr 22 19:57:59.976333 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:57:59.976315 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:58:00.009030 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.009007 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:58:00.014805 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-run-netns\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.014864 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014815 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-ovn\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.014864 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014833 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-run-ovn-kubernetes\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.014864 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014849 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-cnibin\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014864 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-system-cni-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-cni-binary-copy\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014897 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-run-netns\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014912 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysconfig\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014898 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-ovn\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014925 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-run-ovn-kubernetes\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-cnibin\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014955 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1bb13ae-9672-47e1-89b9-7a095040d199-host-slash\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl" Apr 22 19:58:00.014967 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014963 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysconfig\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-etc-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014983 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-system-cni-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1bb13ae-9672-47e1-89b9-7a095040d199-host-slash\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.014999 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-cni-bin\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015020 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-cni-bin\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1296444e-df43-4223-beb7-c3de3946d7a7-host\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015045 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-etc-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015055 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1296444e-df43-4223-beb7-c3de3946d7a7-host\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-os-release\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-multus-certs\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015107 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysctl-conf\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015146 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-multus-certs\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015149 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-cni-netd\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015169 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-cni-netd\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-os-release\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-os-release\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-var-lib-kubelet\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.015320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015278 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-cni-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-var-lib-kubelet\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-kubernetes\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015316 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-os-release\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015368 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysctl-conf\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015375 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-systemd\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015386 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-kubernetes\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015392 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015409 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-systemd\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-kubelet\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015449 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-cni-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd7m\" 
(UniqueName: \"kubernetes.io/projected/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-kube-api-access-vhd7m\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015453 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-kubelet\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-system-cni-dir\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-conf-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015562 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-system-cni-dir\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015579 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ece6b521-94c5-4509-8a90-439f6a926c6b-tmp\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.016167 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015605 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-conf-dir\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015661 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015729 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 
19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-cni-binary-copy\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-lib-modules\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015810 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-tuned\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015859 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-k8s-cni-cncf-io\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015888 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-hostroot\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 
19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xh49\" (UniqueName: \"kubernetes.io/projected/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-kube-api-access-7xh49\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015905 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015944 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1bb13ae-9672-47e1-89b9-7a095040d199-iptables-alerter-script\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl" Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.015953 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-k8s-cni-cncf-io\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.017011 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:58:00.015997    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-var-lib-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016009    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-lib-modules\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016013    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-hostroot\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.017011 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016024    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-log-socket\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016053    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-log-socket\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016059    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t95bv\" (UniqueName: \"kubernetes.io/projected/1296444e-df43-4223-beb7-c3de3946d7a7-kube-api-access-t95bv\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016205    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-modprobe-d\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016234    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqr9d\" (UniqueName: \"kubernetes.io/projected/b1bb13ae-9672-47e1-89b9-7a095040d199-kube-api-access-dqr9d\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016291    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016340    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-cni-multus\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016365    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-kubelet\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016368    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-modprobe-d\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016387    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-sys\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016412    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-systemd-units\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016438    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-ovnkube-script-lib\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016464    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-device-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016482    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-cni-binary-copy\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016488    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-cni-binary-copy\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016508    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1bb13ae-9672-47e1-89b9-7a095040d199-iptables-alerter-script\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016494    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qb8\" (UniqueName: \"kubernetes.io/projected/13033536-961c-41e0-a8b1-73ef9eb5c983-kube-api-access-q4qb8\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.017842 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016540    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-systemd-units\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016492    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-kubelet\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016570    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4ecbe0e-65f8-404d-a158-29f98b2705f1-agent-certs\") pod \"konnectivity-agent-q4kgc\" (UID: \"f4ecbe0e-65f8-404d-a158-29f98b2705f1\") " pod="kube-system/konnectivity-agent-q4kgc"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016598    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-sys-fs\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016655    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-cni-multus\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016685    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2fhf\" (UniqueName: \"kubernetes.io/projected/41c7e8c9-1a30-478c-8609-17a08d4db06c-kube-api-access-s2fhf\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016737    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-cni-bin\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016742    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-var-lib-openvswitch\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016773    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-etc-kubernetes\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016809    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-var-lib-cni-bin\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016822    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-host\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016851    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13033536-961c-41e0-a8b1-73ef9eb5c983-ovn-node-metrics-cert\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016893    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-etc-kubernetes\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016907    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/41c7e8c9-1a30-478c-8609-17a08d4db06c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016916    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-sys\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.016939    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-host\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017020    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.018566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017056    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-socket-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017087    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017103    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-ovnkube-script-lib\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017117    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-systemd\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017144    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-env-overrides\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017164    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-run-systemd\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017170    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4ecbe0e-65f8-404d-a158-29f98b2705f1-konnectivity-ca\") pod \"konnectivity-agent-q4kgc\" (UID: \"f4ecbe0e-65f8-404d-a158-29f98b2705f1\") " pod="kube-system/konnectivity-agent-q4kgc"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017195    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-registration-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017223    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1296444e-df43-4223-beb7-c3de3946d7a7-serviceca\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017267    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-cnibin\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017296    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-daemon-config\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017320    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysctl-d\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017342    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017364    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-slash\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017387    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-socket-dir-parent\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017412    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-node-log\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017434    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-ovnkube-config\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019124 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017458    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017503    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-netns\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017525    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-run\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017549    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znssr\" (UniqueName: \"kubernetes.io/projected/ece6b521-94c5-4509-8a90-439f6a926c6b-kube-api-access-znssr\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017573    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5k9\" (UniqueName: \"kubernetes.io/projected/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-kube-api-access-rv5k9\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017607    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-env-overrides\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017719    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-socket-dir-parent\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017772    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-node-log\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017771    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-cnibin\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017801    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017837    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1296444e-df43-4223-beb7-c3de3946d7a7-serviceca\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.017913    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13033536-961c-41e0-a8b1-73ef9eb5c983-host-slash\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.017962    2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.018027    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-sysctl-d\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.018058    2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:00.518005583 +0000 UTC m=+2.036898626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.018079    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-host-run-netns\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.018215    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ece6b521-94c5-4509-8a90-439f6a926c6b-run\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.018276    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13033536-961c-41e0-a8b1-73ef9eb5c983-ovnkube-config\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.019656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.018318    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-multus-daemon-config\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.020154 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.019038    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/41c7e8c9-1a30-478c-8609-17a08d4db06c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.020154 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.019067    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ece6b521-94c5-4509-8a90-439f6a926c6b-etc-tuned\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.020154 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.019087    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ece6b521-94c5-4509-8a90-439f6a926c6b-tmp\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.020154 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.019438    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13033536-961c-41e0-a8b1-73ef9eb5c983-ovn-node-metrics-cert\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.021613 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.021599    2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:00.021666 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.021615    2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:00.021666 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.021624    2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:00.021666 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.021665    2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:00.52165307 +0000 UTC m=+2.040546083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:00.023864 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.023837    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t95bv\" (UniqueName: \"kubernetes.io/projected/1296444e-df43-4223-beb7-c3de3946d7a7-kube-api-access-t95bv\") pod \"node-ca-46mjj\" (UID: \"1296444e-df43-4223-beb7-c3de3946d7a7\") " pod="openshift-image-registry/node-ca-46mjj"
Apr 22 19:58:00.024149 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.024130    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xh49\" (UniqueName: \"kubernetes.io/projected/e460f2a4-02bb-4b8c-9775-8e03b6c0e88e-kube-api-access-7xh49\") pod \"multus-bmfgl\" (UID: \"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e\") " pod="openshift-multus/multus-bmfgl"
Apr 22 19:58:00.024663 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.024646    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqr9d\" (UniqueName: \"kubernetes.io/projected/b1bb13ae-9672-47e1-89b9-7a095040d199-kube-api-access-dqr9d\") pod \"iptables-alerter-z79rl\" (UID: \"b1bb13ae-9672-47e1-89b9-7a095040d199\") " pod="openshift-network-operator/iptables-alerter-z79rl"
Apr 22 19:58:00.026910 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.026891    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5k9\" (UniqueName: \"kubernetes.io/projected/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-kube-api-access-rv5k9\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:58:00.027044 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.027026    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znssr\" (UniqueName: \"kubernetes.io/projected/ece6b521-94c5-4509-8a90-439f6a926c6b-kube-api-access-znssr\") pod \"tuned-6gkjl\" (UID: \"ece6b521-94c5-4509-8a90-439f6a926c6b\") " pod="openshift-cluster-node-tuning-operator/tuned-6gkjl"
Apr 22 19:58:00.027369 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.027350    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qb8\" (UniqueName: \"kubernetes.io/projected/13033536-961c-41e0-a8b1-73ef9eb5c983-kube-api-access-q4qb8\") pod \"ovnkube-node-wxp4m\" (UID: \"13033536-961c-41e0-a8b1-73ef9eb5c983\") " pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m"
Apr 22 19:58:00.030639 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.030618    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2fhf\" (UniqueName: \"kubernetes.io/projected/41c7e8c9-1a30-478c-8609-17a08d4db06c-kube-api-access-s2fhf\") pod \"multus-additional-cni-plugins-rsqjt\" (UID: \"41c7e8c9-1a30-478c-8609-17a08d4db06c\") " pod="openshift-multus/multus-additional-cni-plugins-rsqjt"
Apr 22 19:58:00.067292 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.067201    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" event={"ID":"004125ff8da455671b80f86ae1638e89","Type":"ContainerStarted","Data":"7b9584dedef664c0e46236b1d894ca615f01d2bbc4b013f2cc0b275b075b5026"}
Apr 22 19:58:00.068147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.068128    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" event={"ID":"d94f1ae0823126e44c89a34cb4f19534","Type":"ContainerStarted","Data":"424c451e8cba69cb99e116dad6d386a619444b371e2dcd7ac73a05cd7777c528"}
Apr 22 19:58:00.117978 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.117945    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhd7m\" (UniqueName: \"kubernetes.io/projected/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-kube-api-access-vhd7m\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.117991    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118024    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-device-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118042    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4ecbe0e-65f8-404d-a158-29f98b2705f1-agent-certs\") pod \"konnectivity-agent-q4kgc\" (UID: \"f4ecbe0e-65f8-404d-a158-29f98b2705f1\") " pod="kube-system/konnectivity-agent-q4kgc"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118058    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-sys-fs\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118084    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118111    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-device-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118116    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-socket-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118163    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118175    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-sys-fs\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118162    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-etc-selinux\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6"
Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118188    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4ecbe0e-65f8-404d-a158-29f98b2705f1-konnectivity-ca\") pod \"konnectivity-agent-q4kgc\" (UID: \"f4ecbe0e-65f8-404d-a158-29f98b2705f1\") " pod="kube-system/konnectivity-agent-q4kgc"
Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118218    2580 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-registration-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118289 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-socket-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 19:58:00.118449 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118290 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-registration-dir\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 19:58:00.118669 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.118653 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4ecbe0e-65f8-404d-a158-29f98b2705f1-konnectivity-ca\") pod \"konnectivity-agent-q4kgc\" (UID: \"f4ecbe0e-65f8-404d-a158-29f98b2705f1\") " pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:00.120468 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.120451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4ecbe0e-65f8-404d-a158-29f98b2705f1-agent-certs\") pod \"konnectivity-agent-q4kgc\" (UID: \"f4ecbe0e-65f8-404d-a158-29f98b2705f1\") " pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:00.126978 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:58:00.126961 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd7m\" (UniqueName: \"kubernetes.io/projected/a88b8c5a-53fe-4054-ae62-90e3a194c1b5-kube-api-access-vhd7m\") pod \"aws-ebs-csi-driver-node-j7cn6\" (UID: \"a88b8c5a-53fe-4054-ae62-90e3a194c1b5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 19:58:00.226093 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.226062 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" Apr 22 19:58:00.233353 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.233327 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podece6b521_94c5_4509_8a90_439f6a926c6b.slice/crio-f2a57b4e29dec4776bf07fd904ce6e16446876973cacb5178e6b2a6ac5bd4a5f WatchSource:0}: Error finding container f2a57b4e29dec4776bf07fd904ce6e16446876973cacb5178e6b2a6ac5bd4a5f: Status 404 returned error can't find the container with id f2a57b4e29dec4776bf07fd904ce6e16446876973cacb5178e6b2a6ac5bd4a5f Apr 22 19:58:00.236861 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.236807 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-46mjj" Apr 22 19:58:00.242982 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.242958 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1296444e_df43_4223_beb7_c3de3946d7a7.slice/crio-5fd68eb46fddb7f72aeb9d637adb2114ca79894d31d87bca26da118019b1b785 WatchSource:0}: Error finding container 5fd68eb46fddb7f72aeb9d637adb2114ca79894d31d87bca26da118019b1b785: Status 404 returned error can't find the container with id 5fd68eb46fddb7f72aeb9d637adb2114ca79894d31d87bca26da118019b1b785 Apr 22 19:58:00.254012 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.253994 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z79rl" Apr 22 19:58:00.259604 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.259584 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bb13ae_9672_47e1_89b9_7a095040d199.slice/crio-667c7346cea6b44bc9cde101e4618f404226cdc0d899a6c663b0265bad5e471b WatchSource:0}: Error finding container 667c7346cea6b44bc9cde101e4618f404226cdc0d899a6c663b0265bad5e471b: Status 404 returned error can't find the container with id 667c7346cea6b44bc9cde101e4618f404226cdc0d899a6c663b0265bad5e471b Apr 22 19:58:00.278194 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.278171 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" Apr 22 19:58:00.283898 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.283876 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c7e8c9_1a30_478c_8609_17a08d4db06c.slice/crio-51216a18019bce7623ffe6d4ad7a9fb740b6bec698fae78c1cb7a21224fa1788 WatchSource:0}: Error finding container 51216a18019bce7623ffe6d4ad7a9fb740b6bec698fae78c1cb7a21224fa1788: Status 404 returned error can't find the container with id 51216a18019bce7623ffe6d4ad7a9fb740b6bec698fae78c1cb7a21224fa1788 Apr 22 19:58:00.284450 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.284435 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bmfgl" Apr 22 19:58:00.290553 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.290534 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:00.290650 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.290575 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode460f2a4_02bb_4b8c_9775_8e03b6c0e88e.slice/crio-5706cb764a9b5a865eb38d5a3628776e9a43e451fac3f270eb319da87bb67c72 WatchSource:0}: Error finding container 5706cb764a9b5a865eb38d5a3628776e9a43e451fac3f270eb319da87bb67c72: Status 404 returned error can't find the container with id 5706cb764a9b5a865eb38d5a3628776e9a43e451fac3f270eb319da87bb67c72 Apr 22 19:58:00.296797 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.296770 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13033536_961c_41e0_a8b1_73ef9eb5c983.slice/crio-6e71d8c03b1c06f43d3736ba9c1295af2614a68bb39b49a118387aa35cd1011e WatchSource:0}: Error finding container 
6e71d8c03b1c06f43d3736ba9c1295af2614a68bb39b49a118387aa35cd1011e: Status 404 returned error can't find the container with id 6e71d8c03b1c06f43d3736ba9c1295af2614a68bb39b49a118387aa35cd1011e Apr 22 19:58:00.296901 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.296830 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:00.301393 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.301372 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" Apr 22 19:58:00.304129 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.304102 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4ecbe0e_65f8_404d_a158_29f98b2705f1.slice/crio-ca4173e6668cef53e636f200b5d74b7f08a6ded076469d7c9976dd8b192d6b64 WatchSource:0}: Error finding container ca4173e6668cef53e636f200b5d74b7f08a6ded076469d7c9976dd8b192d6b64: Status 404 returned error can't find the container with id ca4173e6668cef53e636f200b5d74b7f08a6ded076469d7c9976dd8b192d6b64 Apr 22 19:58:00.309825 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:00.309749 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88b8c5a_53fe_4054_ae62_90e3a194c1b5.slice/crio-0ddc70236a13a977c9752bb01fd7b0800b97491f8bbf840d0efd4e7c0c0ff74a WatchSource:0}: Error finding container 0ddc70236a13a977c9752bb01fd7b0800b97491f8bbf840d0efd4e7c0c0ff74a: Status 404 returned error can't find the container with id 0ddc70236a13a977c9752bb01fd7b0800b97491f8bbf840d0efd4e7c0c0ff74a Apr 22 19:58:00.522033 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.521981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: 
\"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:00.522190 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.522055 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:00.522190 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.522159 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:00.522318 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.522222 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:01.522202356 +0000 UTC m=+3.041095388 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:00.522459 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.522433 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:00.522535 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.522466 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:00.522535 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.522482 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:00.522652 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:00.522536 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:01.522518666 +0000 UTC m=+3.041411693 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:00.938763 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.938729 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:00.955940 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.955846 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:59 +0000 UTC" deadline="2028-01-28 20:14:33.035928414 +0000 UTC" Apr 22 19:58:00.955940 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:00.955884 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15504h16m32.080047731s" Apr 22 19:58:01.087023 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.086985 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" event={"ID":"a88b8c5a-53fe-4054-ae62-90e3a194c1b5","Type":"ContainerStarted","Data":"0ddc70236a13a977c9752bb01fd7b0800b97491f8bbf840d0efd4e7c0c0ff74a"} Apr 22 19:58:01.108387 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.108283 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q4kgc" event={"ID":"f4ecbe0e-65f8-404d-a158-29f98b2705f1","Type":"ContainerStarted","Data":"ca4173e6668cef53e636f200b5d74b7f08a6ded076469d7c9976dd8b192d6b64"} Apr 22 19:58:01.116512 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.116478 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" 
event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerStarted","Data":"51216a18019bce7623ffe6d4ad7a9fb740b6bec698fae78c1cb7a21224fa1788"} Apr 22 19:58:01.121682 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.121648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-46mjj" event={"ID":"1296444e-df43-4223-beb7-c3de3946d7a7","Type":"ContainerStarted","Data":"5fd68eb46fddb7f72aeb9d637adb2114ca79894d31d87bca26da118019b1b785"} Apr 22 19:58:01.137166 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.137130 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" event={"ID":"ece6b521-94c5-4509-8a90-439f6a926c6b","Type":"ContainerStarted","Data":"f2a57b4e29dec4776bf07fd904ce6e16446876973cacb5178e6b2a6ac5bd4a5f"} Apr 22 19:58:01.143126 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.143099 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:01.144861 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.144761 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"6e71d8c03b1c06f43d3736ba9c1295af2614a68bb39b49a118387aa35cd1011e"} Apr 22 19:58:01.152591 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.152562 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bmfgl" event={"ID":"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e","Type":"ContainerStarted","Data":"5706cb764a9b5a865eb38d5a3628776e9a43e451fac3f270eb319da87bb67c72"} Apr 22 19:58:01.157043 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.157009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z79rl" 
event={"ID":"b1bb13ae-9672-47e1-89b9-7a095040d199","Type":"ContainerStarted","Data":"667c7346cea6b44bc9cde101e4618f404226cdc0d899a6c663b0265bad5e471b"} Apr 22 19:58:01.268241 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.268160 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:01.530452 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.530370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:01.530452 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.530447 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:01.530659 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:01.530560 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:01.530659 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:01.530569 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:01.530659 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:01.530586 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:01.530659 ip-10-0-131-194 
kubenswrapper[2580]: E0422 19:58:01.530599 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:01.530659 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:01.530632 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:03.530614765 +0000 UTC m=+5.049507784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:01.530659 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:01.530655 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:03.530638723 +0000 UTC m=+5.049531749 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:01.956320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.956200 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:59 +0000 UTC" deadline="2027-12-19 11:35:14.263252388 +0000 UTC" Apr 22 19:58:01.956320 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:01.956266 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14535h37m12.307008062s" Apr 22 19:58:02.064816 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.064771 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:02.064988 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:02.064893 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:02.065290 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.065271 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:02.065389 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:02.065349 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:02.841613 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.840760 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2ss6g"] Apr 22 19:58:02.843241 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.842638 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:02.849193 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.848493 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:58:02.859473 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.859220 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gxmxw\"" Apr 22 19:58:02.865318 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.861625 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:58:02.939551 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.939464 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0df19ae2-f6f9-432d-bbda-8015b4504723-hosts-file\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 
19:58:02.939551 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.939514 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0df19ae2-f6f9-432d-bbda-8015b4504723-tmp-dir\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:02.939804 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:02.939569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkzv\" (UniqueName: \"kubernetes.io/projected/0df19ae2-f6f9-432d-bbda-8015b4504723-kube-api-access-9gkzv\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.040876 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.040813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0df19ae2-f6f9-432d-bbda-8015b4504723-hosts-file\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.040876 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.040868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0df19ae2-f6f9-432d-bbda-8015b4504723-tmp-dir\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.041405 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.040924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gkzv\" (UniqueName: \"kubernetes.io/projected/0df19ae2-f6f9-432d-bbda-8015b4504723-kube-api-access-9gkzv\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 
19:58:03.041405 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.041342 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0df19ae2-f6f9-432d-bbda-8015b4504723-hosts-file\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.041685 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.041662 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0df19ae2-f6f9-432d-bbda-8015b4504723-tmp-dir\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.051894 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.051835 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gkzv\" (UniqueName: \"kubernetes.io/projected/0df19ae2-f6f9-432d-bbda-8015b4504723-kube-api-access-9gkzv\") pod \"node-resolver-2ss6g\" (UID: \"0df19ae2-f6f9-432d-bbda-8015b4504723\") " pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.162870 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.162841 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2ss6g" Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.546911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:03.546985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:03.547144 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:03.547158 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:03.547169 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:03.547223 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:07.547207668 +0000 UTC m=+9.066100681 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:03.547629 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:03.547737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:03.547670 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:07.547658256 +0000 UTC m=+9.066551269 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:04.063888 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:04.063852 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:04.064416 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:04.063852 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:04.064416 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:04.064004 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:04.064416 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:04.064060 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:06.065218 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:06.064411 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:06.065218 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:06.064560 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:06.065218 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:06.065043 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:06.065218 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:06.065161 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:07.580981 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:07.580922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:07.581413 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:07.581084 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:07.581413 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:07.581159 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:15.581138706 +0000 UTC m=+17.100031721 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:07.581524 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:07.581489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:07.581703 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:07.581683 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:07.581806 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:07.581708 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:07.581806 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:07.581731 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:07.581806 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:07.581797 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:15.581773064 +0000 UTC m=+17.100666080 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:08.064471 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:08.063843 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:08.064471 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:08.063972 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:08.064471 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:08.063847 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:08.064471 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:08.064427 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:10.064779 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:10.064746 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:10.064779 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:10.064775 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:10.065316 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:10.064869 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:10.065316 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:10.065018 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:12.064368 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:12.064323 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:12.064839 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:12.064323 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:12.064839 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:12.064465 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:12.064839 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:12.064578 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:14.064412 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:14.064375 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:14.064945 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:14.064375 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:14.064945 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:14.064533 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:14.064945 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:14.064602 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:15.637651 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:15.637592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:15.637665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:15.637747 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:15.637791 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:15.637813 2580 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:15.637825 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:15.637804 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:31.637787874 +0000 UTC m=+33.156680888 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:15.638090 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:15.637874 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:31.637862309 +0000 UTC m=+33.156755325 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:16.063830 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:16.063798 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:16.064020 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:16.063798 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:16.064020 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:16.063915 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:16.064020 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:16.063969 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:17.996540 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:17.996512 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0df19ae2_f6f9_432d_bbda_8015b4504723.slice/crio-636425ffb1670e13482d7cb3d0577eaca66ad7ed11eaefafeabe8481b8a3d33c WatchSource:0}: Error finding container 636425ffb1670e13482d7cb3d0577eaca66ad7ed11eaefafeabe8481b8a3d33c: Status 404 returned error can't find the container with id 636425ffb1670e13482d7cb3d0577eaca66ad7ed11eaefafeabe8481b8a3d33c Apr 22 19:58:18.064133 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:18.064111 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:18.064212 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:18.064107 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:18.064309 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:18.064216 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:18.064396 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:18.064378 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:18.184269 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:18.184221 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ss6g" event={"ID":"0df19ae2-f6f9-432d-bbda-8015b4504723","Type":"ContainerStarted","Data":"636425ffb1670e13482d7cb3d0577eaca66ad7ed11eaefafeabe8481b8a3d33c"} Apr 22 19:58:19.189295 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.188741 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" event={"ID":"ece6b521-94c5-4509-8a90-439f6a926c6b","Type":"ContainerStarted","Data":"7a78c7ec0dab207c4a56c08cfe21145a50a8e33a354d2026abdd9aa88f963e8f"} Apr 22 19:58:19.193209 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.192972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"be444004f6ea6d2a2d880994cc8af6a963afb54a2ec15920a36d0912e576bcaf"} Apr 22 19:58:19.193209 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.193009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"d06003dad4b77637a7ec283c2618a35a775f957ebcb136f09de324364282096a"} Apr 22 19:58:19.193209 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.193023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"225bf3b657d8bcff3c0374f46d9af0bc38230b900b855a7c8313d4da939ee5a7"} Apr 22 19:58:19.193209 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.193036 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" 
event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"e3537191147844ee565d1ed4ded8c70b71705e9963759e71c1f9250b6ddbe056"} Apr 22 19:58:19.193209 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.193044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"c1817e48c3d5c7dc5e986717b3492b86a46add0f8b8ccf3adea23bb218cce998"} Apr 22 19:58:19.193209 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.193053 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"144e0ade7c115e3044640de0a062088702033c220f99566bf43797d7206a35b8"} Apr 22 19:58:19.194343 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.194245 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bmfgl" event={"ID":"e460f2a4-02bb-4b8c-9775-8e03b6c0e88e","Type":"ContainerStarted","Data":"169cfa5be8d016f3a3716c1f01a5cca711fb3b778eaf0c529791ae15a83c1c76"} Apr 22 19:58:19.195289 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.195208 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z79rl" event={"ID":"b1bb13ae-9672-47e1-89b9-7a095040d199","Type":"ContainerStarted","Data":"ade39b63674f39cb5edbe5018fcfe97ebada8cec67e3bbde49a52524c13924c8"} Apr 22 19:58:19.196438 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.196418 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" event={"ID":"d94f1ae0823126e44c89a34cb4f19534","Type":"ContainerStarted","Data":"292efadb4aa893c56d0358ac99a3f03bb92584aaae89cabfa966731a33670513"} Apr 22 19:58:19.208269 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.208211 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-node-tuning-operator/tuned-6gkjl" podStartSLOduration=2.455233189 podStartE2EDuration="20.208179403s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.237287536 +0000 UTC m=+1.756180563" lastFinishedPulling="2026-04-22 19:58:17.99023375 +0000 UTC m=+19.509126777" observedRunningTime="2026-04-22 19:58:19.208172815 +0000 UTC m=+20.727065875" watchObservedRunningTime="2026-04-22 19:58:19.208179403 +0000 UTC m=+20.727072429" Apr 22 19:58:19.235046 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.233574 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bmfgl" podStartSLOduration=2.4966883859999998 podStartE2EDuration="20.233556331s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.292095232 +0000 UTC m=+1.810988245" lastFinishedPulling="2026-04-22 19:58:18.028963177 +0000 UTC m=+19.547856190" observedRunningTime="2026-04-22 19:58:19.233443781 +0000 UTC m=+20.752336817" watchObservedRunningTime="2026-04-22 19:58:19.233556331 +0000 UTC m=+20.752449365" Apr 22 19:58:19.249280 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.248912 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z79rl" podStartSLOduration=2.55158753 podStartE2EDuration="20.248897043s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.261521646 +0000 UTC m=+1.780414662" lastFinishedPulling="2026-04-22 19:58:17.958831146 +0000 UTC m=+19.477724175" observedRunningTime="2026-04-22 19:58:19.24861567 +0000 UTC m=+20.767508705" watchObservedRunningTime="2026-04-22 19:58:19.248897043 +0000 UTC m=+20.767790080" Apr 22 19:58:19.262088 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:19.262005 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-194.ec2.internal" podStartSLOduration=20.261987624 podStartE2EDuration="20.261987624s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:19.261011804 +0000 UTC m=+20.779904849" watchObservedRunningTime="2026-04-22 19:58:19.261987624 +0000 UTC m=+20.780880661" Apr 22 19:58:20.064306 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.064282 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:20.064419 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.064282 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:20.064419 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:20.064383 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:20.064490 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:20.064443 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:20.184165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.184144 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:58:20.199746 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.199721 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" event={"ID":"a88b8c5a-53fe-4054-ae62-90e3a194c1b5","Type":"ContainerStarted","Data":"ebbe5e59747cc0660c15d9329988c50b8146d315ffff9845c438572ed838ba8f"} Apr 22 19:58:20.200175 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.199755 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" event={"ID":"a88b8c5a-53fe-4054-ae62-90e3a194c1b5","Type":"ContainerStarted","Data":"ff5547c6ee645afc85beb7ad01d38b3e80182532b239a55ea625a9e69770f698"} Apr 22 19:58:20.200946 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.200924 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q4kgc" event={"ID":"f4ecbe0e-65f8-404d-a158-29f98b2705f1","Type":"ContainerStarted","Data":"ff3ebe0df4d5c47dfae49c0bb3ecd424fb4a2b0e07829a541caebc8c6dc00f3c"} Apr 22 19:58:20.202231 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.202200 2580 generic.go:358] "Generic (PLEG): container finished" podID="41c7e8c9-1a30-478c-8609-17a08d4db06c" containerID="2f8f7acbe31cdfaefedf84123ea817c54c7b8f5e0aa68d67a7e350c502363dc1" exitCode=0 Apr 22 19:58:20.202332 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.202276 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" 
event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerDied","Data":"2f8f7acbe31cdfaefedf84123ea817c54c7b8f5e0aa68d67a7e350c502363dc1"} Apr 22 19:58:20.203554 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.203442 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-46mjj" event={"ID":"1296444e-df43-4223-beb7-c3de3946d7a7","Type":"ContainerStarted","Data":"77fb88ed5207a149c306f1357fdff10eda5e81e1e3824353654e1c181052ef1d"} Apr 22 19:58:20.204948 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.204924 2580 generic.go:358] "Generic (PLEG): container finished" podID="004125ff8da455671b80f86ae1638e89" containerID="ad345ab47d39f8e1603b8d6045c8dd471177710f02a6fe6740efc8ceb08eb77d" exitCode=0 Apr 22 19:58:20.205010 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.204994 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" event={"ID":"004125ff8da455671b80f86ae1638e89","Type":"ContainerDied","Data":"ad345ab47d39f8e1603b8d6045c8dd471177710f02a6fe6740efc8ceb08eb77d"} Apr 22 19:58:20.206287 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.206265 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ss6g" event={"ID":"0df19ae2-f6f9-432d-bbda-8015b4504723","Type":"ContainerStarted","Data":"824175e88a297d559e6282629e5602dea61201f81874d29578842dcf1854d817"} Apr 22 19:58:20.217002 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.216963 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-q4kgc" podStartSLOduration=3.530371866 podStartE2EDuration="21.216950318s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.306374827 +0000 UTC m=+1.825267841" lastFinishedPulling="2026-04-22 19:58:17.992953266 +0000 UTC m=+19.511846293" observedRunningTime="2026-04-22 19:58:20.216744006 +0000 UTC 
m=+21.735637051" watchObservedRunningTime="2026-04-22 19:58:20.216950318 +0000 UTC m=+21.735843352" Apr 22 19:58:20.231733 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.231694 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2ss6g" podStartSLOduration=18.231680273 podStartE2EDuration="18.231680273s" podCreationTimestamp="2026-04-22 19:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:20.231361553 +0000 UTC m=+21.750254588" watchObservedRunningTime="2026-04-22 19:58:20.231680273 +0000 UTC m=+21.750573308" Apr 22 19:58:20.264910 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.264847 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-46mjj" podStartSLOduration=3.518889137 podStartE2EDuration="21.264830679s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.24442305 +0000 UTC m=+1.763316063" lastFinishedPulling="2026-04-22 19:58:17.990364585 +0000 UTC m=+19.509257605" observedRunningTime="2026-04-22 19:58:20.264622181 +0000 UTC m=+21.783515217" watchObservedRunningTime="2026-04-22 19:58:20.264830679 +0000 UTC m=+21.783723716" Apr 22 19:58:20.995312 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.994971 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:20.184162729Z","UUID":"aa71e0af-9c18-4dd8-b60e-85211dbd9d63","Handler":null,"Name":"","Endpoint":""} Apr 22 19:58:20.997176 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:20.996937 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:58:20.997176 ip-10-0-131-194 
kubenswrapper[2580]: I0422 19:58:20.996976 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:58:21.210900 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:21.210822 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" event={"ID":"a88b8c5a-53fe-4054-ae62-90e3a194c1b5","Type":"ContainerStarted","Data":"0f05509cbcfbeb11374b763d802fa0705f0fd84c92e371f425d30b0bb80583b6"} Apr 22 19:58:21.213052 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:21.213021 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" event={"ID":"004125ff8da455671b80f86ae1638e89","Type":"ContainerStarted","Data":"cc9838031479223c272eb3621dd457ea0fcd37dff2f01d55a99d9ff1e68bbd57"} Apr 22 19:58:21.216436 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:21.216410 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"37f42d5b1895d6723e225926e9c374d1ee68036989419abe16660ed2ed0c87c2"} Apr 22 19:58:21.227649 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:21.227595 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-j7cn6" podStartSLOduration=1.588179413 podStartE2EDuration="22.227581147s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.311633914 +0000 UTC m=+1.830526932" lastFinishedPulling="2026-04-22 19:58:20.951035638 +0000 UTC m=+22.469928666" observedRunningTime="2026-04-22 19:58:21.227519344 +0000 UTC m=+22.746412380" watchObservedRunningTime="2026-04-22 19:58:21.227581147 +0000 UTC m=+22.746474184" Apr 22 19:58:21.971475 ip-10-0-131-194 kubenswrapper[2580]: I0422 
19:58:21.971437 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:21.972100 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:21.972076 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:21.986348 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:21.986299 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-194.ec2.internal" podStartSLOduration=22.986286997 podStartE2EDuration="22.986286997s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:21.241656062 +0000 UTC m=+22.760549096" watchObservedRunningTime="2026-04-22 19:58:21.986286997 +0000 UTC m=+23.505180032" Apr 22 19:58:22.063931 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:22.063891 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:22.064114 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:22.064003 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:22.064186 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:22.064116 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:22.064273 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:22.064239 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:22.218553 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:22.218510 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:22.219119 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:22.218902 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q4kgc" Apr 22 19:58:24.064482 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.064285 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:24.065048 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.064285 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:24.065048 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:24.064583 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:24.065048 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:24.064676 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:24.223383 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.223347 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerStarted","Data":"d5c929bb071f67991d8a86d722a712c741e45b468aebd8aafb4782f4017b7260"} Apr 22 19:58:24.227402 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.227317 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" event={"ID":"13033536-961c-41e0-a8b1-73ef9eb5c983","Type":"ContainerStarted","Data":"ffca0d1892c8b79d4632615495d27585d5df2cda5ff83830ee0bc5306287f3e9"} Apr 22 19:58:24.228086 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.228062 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:24.228086 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.228090 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:24.228269 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.228102 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:24.244407 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.244383 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:24.244551 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.244459 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:24.271173 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:24.271132 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" podStartSLOduration=6.993406094 podStartE2EDuration="25.271118642s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.298700338 +0000 UTC m=+1.817593352" lastFinishedPulling="2026-04-22 19:58:18.576412886 +0000 UTC m=+20.095305900" observedRunningTime="2026-04-22 19:58:24.270711137 +0000 UTC m=+25.789604172" watchObservedRunningTime="2026-04-22 19:58:24.271118642 +0000 UTC m=+25.790011676" Apr 22 19:58:25.230035 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:25.229998 2580 generic.go:358] "Generic (PLEG): container finished" podID="41c7e8c9-1a30-478c-8609-17a08d4db06c" containerID="d5c929bb071f67991d8a86d722a712c741e45b468aebd8aafb4782f4017b7260" exitCode=0 Apr 22 19:58:25.230488 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:25.230081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerDied","Data":"d5c929bb071f67991d8a86d722a712c741e45b468aebd8aafb4782f4017b7260"} Apr 22 19:58:26.048922 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:26.048600 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l4pll"] Apr 22 19:58:26.049019 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:26.048968 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:26.049095 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:26.049070 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:26.049326 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:26.049297 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fjgnl"] Apr 22 19:58:26.049413 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:26.049401 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:26.049511 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:26.049494 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:26.234151 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:26.234117 2580 generic.go:358] "Generic (PLEG): container finished" podID="41c7e8c9-1a30-478c-8609-17a08d4db06c" containerID="9a01e65e884f88378ffca974ea56bd06604967f93dfd75792e45470d2b4c77c1" exitCode=0 Apr 22 19:58:26.234566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:26.234205 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerDied","Data":"9a01e65e884f88378ffca974ea56bd06604967f93dfd75792e45470d2b4c77c1"} Apr 22 19:58:27.238241 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:27.238190 2580 generic.go:358] "Generic (PLEG): container finished" podID="41c7e8c9-1a30-478c-8609-17a08d4db06c" containerID="87c9f4a52486f8c45ebc661db2420124ed21df8492c8d85b76c35ff7766f98f8" exitCode=0 Apr 22 19:58:27.238587 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:27.238292 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerDied","Data":"87c9f4a52486f8c45ebc661db2420124ed21df8492c8d85b76c35ff7766f98f8"} Apr 22 19:58:28.064778 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:28.064744 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:28.064952 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:28.064744 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:28.064952 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:28.064886 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:28.064952 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:28.064937 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:28.446505 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:28.446476 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2ss6g_0df19ae2-f6f9-432d-bbda-8015b4504723/dns-node-resolver/0.log" Apr 22 19:58:29.425545 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:29.425515 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-46mjj_1296444e-df43-4223-beb7-c3de3946d7a7/node-ca/0.log" Apr 22 19:58:30.064860 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:30.064391 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:30.064860 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:30.064423 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:30.064860 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:30.064522 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l4pll" podUID="0eaeb73f-d4a2-4a3a-8997-fd78247676aa" Apr 22 19:58:30.064860 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:30.064649 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fjgnl" podUID="9641a5d7-3e56-4f40-97db-ff0e3d5cb321" Apr 22 19:58:31.294801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:31.294717 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-194.ec2.internal" event="NodeReady" Apr 22 19:58:31.295368 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:31.294867 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:58:31.651802 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:31.651713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:31.651802 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:31.651780 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:31.652008 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:31.651915 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:31.652008 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:31.651973 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs podName:9641a5d7-3e56-4f40-97db-ff0e3d5cb321 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:03.651958228 +0000 UTC m=+65.170851242 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs") pod "network-metrics-daemon-fjgnl" (UID: "9641a5d7-3e56-4f40-97db-ff0e3d5cb321") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:31.652268 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:31.652224 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:31.652413 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:31.652277 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:31.652413 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:31.652294 2580 projected.go:194] Error preparing data for projected volume kube-api-access-9qnhv for pod openshift-network-diagnostics/network-check-target-l4pll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:31.652413 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:58:31.652350 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv podName:0eaeb73f-d4a2-4a3a-8997-fd78247676aa nodeName:}" failed. No retries permitted until 2026-04-22 19:59:03.652333466 +0000 UTC m=+65.171226493 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9qnhv" (UniqueName: "kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv") pod "network-check-target-l4pll" (UID: "0eaeb73f-d4a2-4a3a-8997-fd78247676aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:32.064153 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.064111 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:58:32.064443 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.064122 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:58:32.068105 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.068080 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxcj4\"" Apr 22 19:58:32.068227 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.068080 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjpls\"" Apr 22 19:58:32.068227 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.068178 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:58:32.068227 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.068184 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:58:32.068227 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:32.068124 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:58:34.253442 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:34.253227 2580 generic.go:358] "Generic (PLEG): container finished" podID="41c7e8c9-1a30-478c-8609-17a08d4db06c" containerID="a1f75217e40ab04fd69c35772690c158d841cbd322a6180cd63991eb5d767b2e" exitCode=0 Apr 22 19:58:34.253920 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:34.253300 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerDied","Data":"a1f75217e40ab04fd69c35772690c158d841cbd322a6180cd63991eb5d767b2e"} Apr 22 19:58:35.258247 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:35.258205 2580 generic.go:358] "Generic (PLEG): container finished" podID="41c7e8c9-1a30-478c-8609-17a08d4db06c" 
containerID="c80d1eb36ab3a3ce4ca0a64568f67888a2ec1cf25d11b57c87417cce0a0b9f2b" exitCode=0 Apr 22 19:58:35.258645 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:35.258273 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerDied","Data":"c80d1eb36ab3a3ce4ca0a64568f67888a2ec1cf25d11b57c87417cce0a0b9f2b"} Apr 22 19:58:36.263082 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:36.263045 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" event={"ID":"41c7e8c9-1a30-478c-8609-17a08d4db06c","Type":"ContainerStarted","Data":"812924a89a5068a6339043256ce9c50bda35e43d742caa0177311049bc0847d6"} Apr 22 19:58:36.290224 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:36.290165 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rsqjt" podStartSLOduration=4.429398757 podStartE2EDuration="37.290149713s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:00.285419613 +0000 UTC m=+1.804312640" lastFinishedPulling="2026-04-22 19:58:33.146170561 +0000 UTC m=+34.665063596" observedRunningTime="2026-04-22 19:58:36.288670091 +0000 UTC m=+37.807563125" watchObservedRunningTime="2026-04-22 19:58:36.290149713 +0000 UTC m=+37.809042749" Apr 22 19:58:51.097157 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.097120 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9w8kc"] Apr 22 19:58:51.100462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.100445 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.104852 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.104832 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:58:51.104852 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.104840 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h76wn\"" Apr 22 19:58:51.105798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.105779 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:58:51.105893 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.105779 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:58:51.105893 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.105831 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:58:51.112505 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.112486 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9w8kc"] Apr 22 19:58:51.113829 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.113810 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wvnxp"] Apr 22 19:58:51.117186 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.117168 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.122683 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.122664 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:58:51.122790 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.122678 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:58:51.122790 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.122707 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xt4d8\"" Apr 22 19:58:51.122985 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.122969 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:58:51.133681 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.133662 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wvnxp"] Apr 22 19:58:51.184136 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184103 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca734b69-52bc-4ae7-9171-1860a1388b9f-cert\") pod \"ingress-canary-wvnxp\" (UID: \"ca734b69-52bc-4ae7-9171-1860a1388b9f\") " pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.184314 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184142 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn4w\" (UniqueName: \"kubernetes.io/projected/ca734b69-52bc-4ae7-9171-1860a1388b9f-kube-api-access-7qn4w\") pod \"ingress-canary-wvnxp\" (UID: \"ca734b69-52bc-4ae7-9171-1860a1388b9f\") " pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 
19:58:51.184314 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184203 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.184314 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184246 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-crio-socket\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.184314 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-data-volume\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.184314 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.184480 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.184334 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fbt2p\" (UniqueName: \"kubernetes.io/projected/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-kube-api-access-fbt2p\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.214112 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.214080 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gh454"] Apr 22 19:58:51.217486 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.217469 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.219920 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.219898 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:58:51.220021 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.219969 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9r99g\"" Apr 22 19:58:51.220021 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.219992 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:58:51.226144 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.226121 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gh454"] Apr 22 19:58:51.285068 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285034 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn4w\" (UniqueName: \"kubernetes.io/projected/ca734b69-52bc-4ae7-9171-1860a1388b9f-kube-api-access-7qn4w\") pod \"ingress-canary-wvnxp\" (UID: \"ca734b69-52bc-4ae7-9171-1860a1388b9f\") " pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.285236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285084 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.285236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-crio-socket\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.285236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285128 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-data-volume\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.285236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285154 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.285236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285180 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsppg\" (UniqueName: \"kubernetes.io/projected/ffd7b823-9cf8-4e86-ac80-22981e293e06-kube-api-access-dsppg\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " 
pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.285236 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285223 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbt2p\" (UniqueName: \"kubernetes.io/projected/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-kube-api-access-fbt2p\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.285595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285270 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd7b823-9cf8-4e86-ac80-22981e293e06-config-volume\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.285595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffd7b823-9cf8-4e86-ac80-22981e293e06-metrics-tls\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.285595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285370 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-crio-socket\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.285595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ffd7b823-9cf8-4e86-ac80-22981e293e06-tmp-dir\") pod 
\"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.285595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285504 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca734b69-52bc-4ae7-9171-1860a1388b9f-cert\") pod \"ingress-canary-wvnxp\" (UID: \"ca734b69-52bc-4ae7-9171-1860a1388b9f\") " pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.285595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285541 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-data-volume\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.286023 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.285999 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.289600 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.289580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.289700 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.289635 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ca734b69-52bc-4ae7-9171-1860a1388b9f-cert\") pod \"ingress-canary-wvnxp\" (UID: \"ca734b69-52bc-4ae7-9171-1860a1388b9f\") " pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.296056 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.296027 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn4w\" (UniqueName: \"kubernetes.io/projected/ca734b69-52bc-4ae7-9171-1860a1388b9f-kube-api-access-7qn4w\") pod \"ingress-canary-wvnxp\" (UID: \"ca734b69-52bc-4ae7-9171-1860a1388b9f\") " pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.296961 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.296945 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbt2p\" (UniqueName: \"kubernetes.io/projected/c16ab73e-ca33-4cd7-bef1-c05fb49576cf-kube-api-access-fbt2p\") pod \"insights-runtime-extractor-9w8kc\" (UID: \"c16ab73e-ca33-4cd7-bef1-c05fb49576cf\") " pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.386122 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.386036 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsppg\" (UniqueName: \"kubernetes.io/projected/ffd7b823-9cf8-4e86-ac80-22981e293e06-kube-api-access-dsppg\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.386122 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.386083 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd7b823-9cf8-4e86-ac80-22981e293e06-config-volume\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.386122 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.386104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffd7b823-9cf8-4e86-ac80-22981e293e06-metrics-tls\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.386422 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.386138 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ffd7b823-9cf8-4e86-ac80-22981e293e06-tmp-dir\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.386552 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.386531 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ffd7b823-9cf8-4e86-ac80-22981e293e06-tmp-dir\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.386673 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.386646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ffd7b823-9cf8-4e86-ac80-22981e293e06-config-volume\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.388544 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.388524 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ffd7b823-9cf8-4e86-ac80-22981e293e06-metrics-tls\") pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.394285 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.394243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsppg\" (UniqueName: \"kubernetes.io/projected/ffd7b823-9cf8-4e86-ac80-22981e293e06-kube-api-access-dsppg\") 
pod \"dns-default-gh454\" (UID: \"ffd7b823-9cf8-4e86-ac80-22981e293e06\") " pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.409401 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.409380 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9w8kc" Apr 22 19:58:51.426121 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.426097 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wvnxp" Apr 22 19:58:51.526459 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.526354 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gh454" Apr 22 19:58:51.558053 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.558022 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9w8kc"] Apr 22 19:58:51.571171 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.571145 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wvnxp"] Apr 22 19:58:51.574320 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:51.574288 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca734b69_52bc_4ae7_9171_1860a1388b9f.slice/crio-1401dc1dc84d6aa2b303f9b370536b46d2d5576814bf6d31fc4fc05696822ecc WatchSource:0}: Error finding container 1401dc1dc84d6aa2b303f9b370536b46d2d5576814bf6d31fc4fc05696822ecc: Status 404 returned error can't find the container with id 1401dc1dc84d6aa2b303f9b370536b46d2d5576814bf6d31fc4fc05696822ecc Apr 22 19:58:51.649810 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:51.649786 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gh454"] Apr 22 19:58:52.294549 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:52.294506 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-gh454" event={"ID":"ffd7b823-9cf8-4e86-ac80-22981e293e06","Type":"ContainerStarted","Data":"5977af57497b61b668e575bd193a3dec86cff7a3655086a904e2f605d704168f"} Apr 22 19:58:52.295827 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:52.295776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wvnxp" event={"ID":"ca734b69-52bc-4ae7-9171-1860a1388b9f","Type":"ContainerStarted","Data":"1401dc1dc84d6aa2b303f9b370536b46d2d5576814bf6d31fc4fc05696822ecc"} Apr 22 19:58:52.297374 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:52.297347 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9w8kc" event={"ID":"c16ab73e-ca33-4cd7-bef1-c05fb49576cf","Type":"ContainerStarted","Data":"d170e0a6d41db0ac552a9e6a9a312439316723ed8918835e9b17ebb22226f882"} Apr 22 19:58:52.297502 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:52.297385 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9w8kc" event={"ID":"c16ab73e-ca33-4cd7-bef1-c05fb49576cf","Type":"ContainerStarted","Data":"6a1263f81a0a174bcd8be1133a51e03d9929aa54002cef154cf0426f97bdb722"} Apr 22 19:58:53.301462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:53.301420 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9w8kc" event={"ID":"c16ab73e-ca33-4cd7-bef1-c05fb49576cf","Type":"ContainerStarted","Data":"9890e1d25d18c4557d46cbb534121b4cbbe312b2acd32a2fbfb67603afe458a0"} Apr 22 19:58:54.304717 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:54.304674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wvnxp" event={"ID":"ca734b69-52bc-4ae7-9171-1860a1388b9f","Type":"ContainerStarted","Data":"dfe0fff583f846bd27ce3e73da5f08ee298f19565101e97d87f7b272008c02fd"} Apr 22 19:58:54.306494 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:54.306462 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gh454" event={"ID":"ffd7b823-9cf8-4e86-ac80-22981e293e06","Type":"ContainerStarted","Data":"e65edaf338bca46c891c04454dd1fbb1ab6bca63c27f146cf9e3ed881d7dadf4"} Apr 22 19:58:54.306494 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:54.306494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gh454" event={"ID":"ffd7b823-9cf8-4e86-ac80-22981e293e06","Type":"ContainerStarted","Data":"edf587e64ee51296ac9df6d2eed5267b95847e8079d3a77d05323665401ea015"} Apr 22 19:58:54.306652 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:54.306637 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gh454" Apr 22 19:58:54.344044 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:54.343572 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wvnxp" podStartSLOduration=1.487194374 podStartE2EDuration="3.343559132s" podCreationTimestamp="2026-04-22 19:58:51 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.57655449 +0000 UTC m=+53.095447508" lastFinishedPulling="2026-04-22 19:58:53.43291925 +0000 UTC m=+54.951812266" observedRunningTime="2026-04-22 19:58:54.320995786 +0000 UTC m=+55.839888820" watchObservedRunningTime="2026-04-22 19:58:54.343559132 +0000 UTC m=+55.862452166" Apr 22 19:58:54.344044 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:54.343712 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gh454" podStartSLOduration=1.572855503 podStartE2EDuration="3.34370654s" podCreationTimestamp="2026-04-22 19:58:51 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.658571788 +0000 UTC m=+53.177464801" lastFinishedPulling="2026-04-22 19:58:53.429422809 +0000 UTC m=+54.948315838" observedRunningTime="2026-04-22 19:58:54.343308257 +0000 UTC m=+55.862201292" watchObservedRunningTime="2026-04-22 
19:58:54.34370654 +0000 UTC m=+55.862599588" Apr 22 19:58:55.312053 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:55.312012 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9w8kc" event={"ID":"c16ab73e-ca33-4cd7-bef1-c05fb49576cf","Type":"ContainerStarted","Data":"3968f80ce9980b98d97dd8637907b7f614d49952fb32c693bad2e9276746bcc3"} Apr 22 19:58:55.329977 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:55.329934 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9w8kc" podStartSLOduration=1.153784745 podStartE2EDuration="4.329921005s" podCreationTimestamp="2026-04-22 19:58:51 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.649516333 +0000 UTC m=+53.168409345" lastFinishedPulling="2026-04-22 19:58:54.825652588 +0000 UTC m=+56.344545605" observedRunningTime="2026-04-22 19:58:55.328719853 +0000 UTC m=+56.847612888" watchObservedRunningTime="2026-04-22 19:58:55.329921005 +0000 UTC m=+56.848814040" Apr 22 19:58:56.253135 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:56.253106 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wxp4m" Apr 22 19:58:57.100998 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.100964 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6"] Apr 22 19:58:57.116422 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.116398 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:57.121831 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.121797 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 19:58:57.121954 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.121839 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-bggkw\"" Apr 22 19:58:57.121954 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.121902 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6"] Apr 22 19:58:57.240792 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.240747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kfrl6\" (UID: \"1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:57.341619 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.341585 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kfrl6\" (UID: \"1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:57.344342 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.344319 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-kfrl6\" (UID: \"1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:57.426094 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.425988 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:57.560302 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:57.560238 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6"] Apr 22 19:58:57.563902 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:58:57.563875 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2b7cd9_b3ea_4b04_9e45_1d2ff400e912.slice/crio-d6c3442f00a0965435cc031b0f1cfe9c5e23a83981da0e7b4ecb316f31afccf4 WatchSource:0}: Error finding container d6c3442f00a0965435cc031b0f1cfe9c5e23a83981da0e7b4ecb316f31afccf4: Status 404 returned error can't find the container with id d6c3442f00a0965435cc031b0f1cfe9c5e23a83981da0e7b4ecb316f31afccf4 Apr 22 19:58:58.321324 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:58.321283 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" event={"ID":"1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912","Type":"ContainerStarted","Data":"d6c3442f00a0965435cc031b0f1cfe9c5e23a83981da0e7b4ecb316f31afccf4"} Apr 22 19:58:59.324564 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:59.324530 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" 
event={"ID":"1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912","Type":"ContainerStarted","Data":"bc52a86f3c6834f930c43017500417d7c5c9ad3b9cccad36bf593b09fe297fad"} Apr 22 19:58:59.324996 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:59.324775 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:59.329575 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:59.329553 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" Apr 22 19:58:59.342822 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:58:59.342780 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-kfrl6" podStartSLOduration=1.263639787 podStartE2EDuration="2.342761233s" podCreationTimestamp="2026-04-22 19:58:57 +0000 UTC" firstStartedPulling="2026-04-22 19:58:57.566121605 +0000 UTC m=+59.085014623" lastFinishedPulling="2026-04-22 19:58:58.645243042 +0000 UTC m=+60.164136069" observedRunningTime="2026-04-22 19:58:59.341687956 +0000 UTC m=+60.860580992" watchObservedRunningTime="2026-04-22 19:58:59.342761233 +0000 UTC m=+60.861654269" Apr 22 19:59:00.141474 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.141436 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fh7v7"] Apr 22 19:59:00.145583 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.145566 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.148049 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.148021 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:59:00.148194 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.148042 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:59:00.148194 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.148097 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:59:00.148594 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.148576 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:59:00.148641 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.148614 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:59:00.148675 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.148658 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9w87k\"" Apr 22 19:59:00.152993 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.152976 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fh7v7"] Apr 22 19:59:00.258943 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.258902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a60d5a0-fc39-4655-8c92-ad8816d682db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: 
\"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.258943 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.258946 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a60d5a0-fc39-4655-8c92-ad8816d682db-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.259162 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.259028 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a60d5a0-fc39-4655-8c92-ad8816d682db-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.259162 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.259068 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsgxb\" (UniqueName: \"kubernetes.io/projected/5a60d5a0-fc39-4655-8c92-ad8816d682db-kube-api-access-qsgxb\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.359715 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.359670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a60d5a0-fc39-4655-8c92-ad8816d682db-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 
19:59:00.359715 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.359721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsgxb\" (UniqueName: \"kubernetes.io/projected/5a60d5a0-fc39-4655-8c92-ad8816d682db-kube-api-access-qsgxb\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.360222 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.359796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a60d5a0-fc39-4655-8c92-ad8816d682db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.360222 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.359824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a60d5a0-fc39-4655-8c92-ad8816d682db-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.360599 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.360488 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a60d5a0-fc39-4655-8c92-ad8816d682db-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.362332 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.362311 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a60d5a0-fc39-4655-8c92-ad8816d682db-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.362417 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.362399 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a60d5a0-fc39-4655-8c92-ad8816d682db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.367888 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.367859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsgxb\" (UniqueName: \"kubernetes.io/projected/5a60d5a0-fc39-4655-8c92-ad8816d682db-kube-api-access-qsgxb\") pod \"prometheus-operator-5676c8c784-fh7v7\" (UID: \"5a60d5a0-fc39-4655-8c92-ad8816d682db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.455012 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.454981 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" Apr 22 19:59:00.574514 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:00.574483 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-fh7v7"] Apr 22 19:59:00.578375 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:00.578352 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a60d5a0_fc39_4655_8c92_ad8816d682db.slice/crio-9e8e4f66ec70fcf588ec00e054a685ccce0425c4f192a07bde8dff8577d81c7f WatchSource:0}: Error finding container 9e8e4f66ec70fcf588ec00e054a685ccce0425c4f192a07bde8dff8577d81c7f: Status 404 returned error can't find the container with id 9e8e4f66ec70fcf588ec00e054a685ccce0425c4f192a07bde8dff8577d81c7f Apr 22 19:59:01.331847 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:01.331806 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" event={"ID":"5a60d5a0-fc39-4655-8c92-ad8816d682db","Type":"ContainerStarted","Data":"9e8e4f66ec70fcf588ec00e054a685ccce0425c4f192a07bde8dff8577d81c7f"} Apr 22 19:59:02.335678 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:02.335583 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" event={"ID":"5a60d5a0-fc39-4655-8c92-ad8816d682db","Type":"ContainerStarted","Data":"2ca7bfe8d208bf6938f58eef01ae911ac3a44d4db58ca747b939adfdfeada952"} Apr 22 19:59:02.335678 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:02.335625 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" event={"ID":"5a60d5a0-fc39-4655-8c92-ad8816d682db","Type":"ContainerStarted","Data":"d321acf969f9fd75495c6d60048c0c765f4b911a882dacd3afa91aca61e737f3"} Apr 22 19:59:02.352509 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:02.352467 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-fh7v7" podStartSLOduration=0.858076508 podStartE2EDuration="2.352452462s" podCreationTimestamp="2026-04-22 19:59:00 +0000 UTC" firstStartedPulling="2026-04-22 19:59:00.580562855 +0000 UTC m=+62.099455867" lastFinishedPulling="2026-04-22 19:59:02.074938797 +0000 UTC m=+63.593831821" observedRunningTime="2026-04-22 19:59:02.350907114 +0000 UTC m=+63.869800149" watchObservedRunningTime="2026-04-22 19:59:02.352452462 +0000 UTC m=+63.871345496" Apr 22 19:59:03.686849 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.686812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:59:03.687237 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.686886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl" Apr 22 19:59:03.689458 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.689425 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:59:03.689650 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.689522 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:59:03.699818 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.699792 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 19:59:03.700108 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.700086 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9641a5d7-3e56-4f40-97db-ff0e3d5cb321-metrics-certs\") pod \"network-metrics-daemon-fjgnl\" (UID: \"9641a5d7-3e56-4f40-97db-ff0e3d5cb321\") " pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:59:03.710393 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.710371 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qnhv\" (UniqueName: \"kubernetes.io/projected/0eaeb73f-d4a2-4a3a-8997-fd78247676aa-kube-api-access-9qnhv\") pod \"network-check-target-l4pll\" (UID: \"0eaeb73f-d4a2-4a3a-8997-fd78247676aa\") " pod="openshift-network-diagnostics/network-check-target-l4pll"
Apr 22 19:59:03.879561 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.879527 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjpls\""
Apr 22 19:59:03.884516 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.884489 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxcj4\""
Apr 22 19:59:03.887371 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.887351 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l4pll"
Apr 22 19:59:03.893155 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:03.893135 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fjgnl"
Apr 22 19:59:04.025365 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.025325 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l4pll"]
Apr 22 19:59:04.028937 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:04.028909 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eaeb73f_d4a2_4a3a_8997_fd78247676aa.slice/crio-e645b59d3de19197b5d8d5cc9fff7159f33e9169a1e4484ea8d86337fbf3ce50 WatchSource:0}: Error finding container e645b59d3de19197b5d8d5cc9fff7159f33e9169a1e4484ea8d86337fbf3ce50: Status 404 returned error can't find the container with id e645b59d3de19197b5d8d5cc9fff7159f33e9169a1e4484ea8d86337fbf3ce50
Apr 22 19:59:04.040128 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.040106 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fjgnl"]
Apr 22 19:59:04.043146 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:04.043118 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9641a5d7_3e56_4f40_97db_ff0e3d5cb321.slice/crio-a7ff0044434bc479ba992c3b86031629e9c9c7eae6f007add0e0e7a774c44b87 WatchSource:0}: Error finding container a7ff0044434bc479ba992c3b86031629e9c9c7eae6f007add0e0e7a774c44b87: Status 404 returned error can't find the container with id a7ff0044434bc479ba992c3b86031629e9c9c7eae6f007add0e0e7a774c44b87
Apr 22 19:59:04.314868 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.314785 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gh454"
Apr 22 19:59:04.340831 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.340798 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fjgnl" event={"ID":"9641a5d7-3e56-4f40-97db-ff0e3d5cb321","Type":"ContainerStarted","Data":"a7ff0044434bc479ba992c3b86031629e9c9c7eae6f007add0e0e7a774c44b87"}
Apr 22 19:59:04.341907 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.341873 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l4pll" event={"ID":"0eaeb73f-d4a2-4a3a-8997-fd78247676aa","Type":"ContainerStarted","Data":"e645b59d3de19197b5d8d5cc9fff7159f33e9169a1e4484ea8d86337fbf3ce50"}
Apr 22 19:59:04.480771 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.480734 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"]
Apr 22 19:59:04.499101 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.499067 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-56gsc"]
Apr 22 19:59:04.499340 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.499312 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.502165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.501798 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 19:59:04.502165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.501836 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 19:59:04.502560 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.502459 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-jqnlc\""
Apr 22 19:59:04.513230 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.513204 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"]
Apr 22 19:59:04.513373 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.513238 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k8cpr"]
Apr 22 19:59:04.513741 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.513708 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.516667 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.516643 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 19:59:04.517156 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.517134 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 19:59:04.517298 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.517245 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 19:59:04.517383 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.517144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wnfjk\""
Apr 22 19:59:04.535501 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.535476 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-56gsc"]
Apr 22 19:59:04.535643 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.535625 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.537987 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.537969 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 19:59:04.538107 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.538005 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-98m5q\""
Apr 22 19:59:04.538173 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.538132 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 19:59:04.538173 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.538150 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 19:59:04.592608 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.592524 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-595nc\" (UniqueName: \"kubernetes.io/projected/294d25c8-e955-4e1e-a99e-5c4f1130d221-kube-api-access-595nc\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.592608 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.592569 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294d25c8-e955-4e1e-a99e-5c4f1130d221-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.592826 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.592607 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.592826 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.592719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.693761 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-595nc\" (UniqueName: \"kubernetes.io/projected/294d25c8-e955-4e1e-a99e-5c4f1130d221-kube-api-access-595nc\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693773 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693815 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzg9z\" (UniqueName: \"kubernetes.io/projected/3512a00a-a9e9-4a62-b382-f679dbdd1b67-kube-api-access-mzg9z\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693846 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693875 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-tls\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693907 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3512a00a-a9e9-4a62-b382-f679dbdd1b67-metrics-client-ca\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693970 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.693993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-wtmp\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294d25c8-e955-4e1e-a99e-5c4f1130d221-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vc4r\" (UniqueName: \"kubernetes.io/projected/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-api-access-8vc4r\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694088 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-root\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694170 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-textfile\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694210 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1df16191-fb27-4b2c-b54d-efc9ceebda35-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-sys\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694301 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1df16191-fb27-4b2c-b54d-efc9ceebda35-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.694323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:04.694424 2580 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 22 19:59:04.694737 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:04.694477 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-tls podName:294d25c8-e955-4e1e-a99e-5c4f1130d221 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.194461405 +0000 UTC m=+66.713354419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-ffj7j" (UID: "294d25c8-e955-4e1e-a99e-5c4f1130d221") : secret "openshift-state-metrics-tls" not found
Apr 22 19:59:04.697418 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.697363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.708200 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.708160 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/294d25c8-e955-4e1e-a99e-5c4f1130d221-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.710836 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.710797 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-595nc\" (UniqueName: \"kubernetes.io/projected/294d25c8-e955-4e1e-a99e-5c4f1130d221-kube-api-access-595nc\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:04.795617 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.795790 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzg9z\" (UniqueName: \"kubernetes.io/projected/3512a00a-a9e9-4a62-b382-f679dbdd1b67-kube-api-access-mzg9z\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.795790 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:04.795761 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 19:59:04.795915 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795789 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.795915 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:04.795835 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls podName:1df16191-fb27-4b2c-b54d-efc9ceebda35 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.295812579 +0000 UTC m=+66.814705614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-56gsc" (UID: "1df16191-fb27-4b2c-b54d-efc9ceebda35") : secret "kube-state-metrics-tls" not found
Apr 22 19:59:04.795915 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795876 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-tls\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796072 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796072 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795951 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3512a00a-a9e9-4a62-b382-f679dbdd1b67-metrics-client-ca\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796072 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:04.795984 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 19:59:04.796072 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.795997 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.796072 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:04.796050 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-tls podName:3512a00a-a9e9-4a62-b382-f679dbdd1b67 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.296031691 +0000 UTC m=+66.814924718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-tls") pod "node-exporter-k8cpr" (UID: "3512a00a-a9e9-4a62-b382-f679dbdd1b67") : secret "node-exporter-tls" not found
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-wtmp\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vc4r\" (UniqueName: \"kubernetes.io/projected/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-api-access-8vc4r\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-root\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796187 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-textfile\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796246 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1df16191-fb27-4b2c-b54d-efc9ceebda35-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796301 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-sys\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796364 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796325 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1df16191-fb27-4b2c-b54d-efc9ceebda35-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.796728 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796447 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-accelerators-collector-config\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796728 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796576 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3512a00a-a9e9-4a62-b382-f679dbdd1b67-metrics-client-ca\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796728 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-textfile\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796728 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1df16191-fb27-4b2c-b54d-efc9ceebda35-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.796728 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796713 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-root\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.796728 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796719 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-wtmp\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.797015 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.796769 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3512a00a-a9e9-4a62-b382-f679dbdd1b67-sys\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.797414 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.797373 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1df16191-fb27-4b2c-b54d-efc9ceebda35-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.799333 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.799284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.799488 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.799370 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.808634 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.808603 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:04.809511 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.809467 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzg9z\" (UniqueName: \"kubernetes.io/projected/3512a00a-a9e9-4a62-b382-f679dbdd1b67-kube-api-access-mzg9z\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:04.813644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:04.813340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vc4r\" (UniqueName: \"kubernetes.io/projected/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-api-access-8vc4r\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:05.200277 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.200216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:05.203064 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.203040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/294d25c8-e955-4e1e-a99e-5c4f1130d221-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-ffj7j\" (UID: \"294d25c8-e955-4e1e-a99e-5c4f1130d221\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:05.301297 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.301236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc"
Apr 22 19:59:05.301461 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.301325 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-tls\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:05.301461 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:05.301400 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 19:59:05.301550 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:05.301471 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls podName:1df16191-fb27-4b2c-b54d-efc9ceebda35 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:06.30145471 +0000 UTC m=+67.820347743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-56gsc" (UID: "1df16191-fb27-4b2c-b54d-efc9ceebda35") : secret "kube-state-metrics-tls" not found
Apr 22 19:59:05.304208 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.304182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3512a00a-a9e9-4a62-b382-f679dbdd1b67-node-exporter-tls\") pod \"node-exporter-k8cpr\" (UID: \"3512a00a-a9e9-4a62-b382-f679dbdd1b67\") " pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:05.413084 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.413045 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"
Apr 22 19:59:05.449160 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.449124 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-k8cpr"
Apr 22 19:59:05.603197 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.603123 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:59:05.630574 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.630549 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:59:05.630727 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.630714 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:59:05.633573 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.633550 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 19:59:05.633694 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.633550 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 19:59:05.633694 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.633637 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 19:59:05.633847 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.633829 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 19:59:05.634339 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.634019 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 19:59:05.634339 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.634074 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 19:59:05.634339
ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.634077 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:59:05.634339 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.634096 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2pj74\"" Apr 22 19:59:05.634339 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.634087 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:59:05.634339 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.634230 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:59:05.805795 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805765 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805816 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-config-volume\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805893 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-config-out\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805923 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805950 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9klp\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-kube-api-access-w9klp\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.805979 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-web-config\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806119 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806147 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806143 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806647 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806185 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.806905 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.806879 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j"] Apr 22 19:59:05.809798 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:05.809772 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294d25c8_e955_4e1e_a99e_5c4f1130d221.slice/crio-a0c09b6a8f8f1d1fed47fd8375f60b4d09ddae50841409ffac5f97724822a0ce WatchSource:0}: Error finding container a0c09b6a8f8f1d1fed47fd8375f60b4d09ddae50841409ffac5f97724822a0ce: Status 404 returned error can't find the container with id a0c09b6a8f8f1d1fed47fd8375f60b4d09ddae50841409ffac5f97724822a0ce Apr 22 19:59:05.906701 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.906673 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-config-volume\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.906848 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.906724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.906848 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.906754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-config-out\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.906848 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.906783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.906848 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.906808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9klp\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-kube-api-access-w9klp\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907063 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.906958 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
19:59:05.907063 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907063 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907048 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-web-config\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907301 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907087 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907301 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907301 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907301 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907200 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.907301 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.907238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.908590 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.908112 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.908590 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.908387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.910265 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.910200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.911005 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.910980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-config-volume\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.911112 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.911074 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-config-out\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.917987 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.912413 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.917987 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.912490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.917987 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.912703 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.917987 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.912865 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-web-config\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.917987 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.913036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.918943 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.918915 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.920331 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.920304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9klp\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-kube-api-access-w9klp\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.923888 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.923841 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:05.942771 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:05.942305 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:59:06.095161 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.095104 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:59:06.309690 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.309644 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" Apr 22 19:59:06.312387 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.312363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df16191-fb27-4b2c-b54d-efc9ceebda35-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-56gsc\" (UID: \"1df16191-fb27-4b2c-b54d-efc9ceebda35\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" Apr 22 19:59:06.327227 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.327195 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" Apr 22 19:59:06.349909 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.349870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j" event={"ID":"294d25c8-e955-4e1e-a99e-5c4f1130d221","Type":"ContainerStarted","Data":"5487a651b6ee6ab1c14bf638a4beb1072f7b72d1e96d9fb70023f70737e3d1d3"} Apr 22 19:59:06.350127 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.349931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j" event={"ID":"294d25c8-e955-4e1e-a99e-5c4f1130d221","Type":"ContainerStarted","Data":"a0c09b6a8f8f1d1fed47fd8375f60b4d09ddae50841409ffac5f97724822a0ce"} Apr 22 19:59:06.351202 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.351175 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8cpr" event={"ID":"3512a00a-a9e9-4a62-b382-f679dbdd1b67","Type":"ContainerStarted","Data":"c27607684494eb0b83e9efac1ccdf30c0872309c4d4a501089706960674d7e7f"} Apr 22 19:59:06.352968 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.352942 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fjgnl" event={"ID":"9641a5d7-3e56-4f40-97db-ff0e3d5cb321","Type":"ContainerStarted","Data":"93b319650900e5e167221e8e8cc8314c766553844d4bb83a88aa4c3457a00267"} Apr 22 19:59:06.353080 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.352971 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fjgnl" event={"ID":"9641a5d7-3e56-4f40-97db-ff0e3d5cb321","Type":"ContainerStarted","Data":"003fcf21762b2ed0030a60df19b2071ca3d105f60fc4fb4a19652a403d24b254"} Apr 22 19:59:06.368644 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.368593 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-fjgnl" podStartSLOduration=65.789821514 podStartE2EDuration="1m7.368575329s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:59:04.044997082 +0000 UTC m=+65.563890096" lastFinishedPulling="2026-04-22 19:59:05.623750883 +0000 UTC m=+67.142643911" observedRunningTime="2026-04-22 19:59:06.36844489 +0000 UTC m=+67.887337958" watchObservedRunningTime="2026-04-22 19:59:06.368575329 +0000 UTC m=+67.887468365" Apr 22 19:59:06.609855 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.609764 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7"] Apr 22 19:59:06.614603 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.614130 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.617396 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.617204 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 19:59:06.617396 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.617266 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 19:59:06.617396 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.617265 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 19:59:06.617396 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.617269 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 19:59:06.618484 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.618276 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 19:59:06.618484 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.618350 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6hks5eui8er0a\"" Apr 22 19:59:06.618484 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.618287 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-6hgs6\"" Apr 22 19:59:06.625365 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.625343 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7"] Apr 22 19:59:06.712523 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-metrics-client-ca\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712685 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712685 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712584 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712685 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712623 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-tls\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712685 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712665 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712867 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712736 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkkl4\" (UniqueName: \"kubernetes.io/projected/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-kube-api-access-qkkl4\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712867 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-grpc-tls\") pod 
\"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.712867 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.712826 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814009 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.813965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814027 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkkl4\" (UniqueName: \"kubernetes.io/projected/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-kube-api-access-qkkl4\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814059 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-grpc-tls\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 
19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814149 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-metrics-client-ca\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814217 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814289 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.814510 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814339 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-tls\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.815027 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.814979 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-metrics-client-ca\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.817165 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.817137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.817820 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.817786 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-grpc-tls\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.818235 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.818184 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: 
\"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.818406 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.818344 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.818514 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.818406 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.818581 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.818567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-secret-thanos-querier-tls\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.824295 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.824273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkkl4\" (UniqueName: \"kubernetes.io/projected/3a405cd9-a1f0-440e-a6a7-dbcea8867edb-kube-api-access-qkkl4\") pod \"thanos-querier-7b4fdb9788-4hxj7\" (UID: \"3a405cd9-a1f0-440e-a6a7-dbcea8867edb\") " pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.925624 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:06.925525 
2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:06.967590 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:06.967558 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b894a9_0010_423c_9537_f336848aa436.slice/crio-b8131c6f579c2f907dd7d20de0d640446e0d68e11d8d7350e728d6636c831b7b WatchSource:0}: Error finding container b8131c6f579c2f907dd7d20de0d640446e0d68e11d8d7350e728d6636c831b7b: Status 404 returned error can't find the container with id b8131c6f579c2f907dd7d20de0d640446e0d68e11d8d7350e728d6636c831b7b Apr 22 19:59:07.132116 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.132060 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7"] Apr 22 19:59:07.136980 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:07.136948 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a405cd9_a1f0_440e_a6a7_dbcea8867edb.slice/crio-9a626b2f448bfe89885bb09c866a93f1ec877ce45007bed0d1b8038e89444415 WatchSource:0}: Error finding container 9a626b2f448bfe89885bb09c866a93f1ec877ce45007bed0d1b8038e89444415: Status 404 returned error can't find the container with id 9a626b2f448bfe89885bb09c866a93f1ec877ce45007bed0d1b8038e89444415 Apr 22 19:59:07.146194 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.146165 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-56gsc"] Apr 22 19:59:07.149510 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:07.149484 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df16191_fb27_4b2c_b54d_efc9ceebda35.slice/crio-a1e5ec2cf7b0a292adba1205f94528c025be602564974d4730dd8b0c52ab2ce4 WatchSource:0}: Error 
finding container a1e5ec2cf7b0a292adba1205f94528c025be602564974d4730dd8b0c52ab2ce4: Status 404 returned error can't find the container with id a1e5ec2cf7b0a292adba1205f94528c025be602564974d4730dd8b0c52ab2ce4 Apr 22 19:59:07.363201 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.363157 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j" event={"ID":"294d25c8-e955-4e1e-a99e-5c4f1130d221","Type":"ContainerStarted","Data":"487074c144615a7762dd6aeeb6670b9f192a7f18016ca03a320dd12a770264fd"} Apr 22 19:59:07.364493 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.364458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"9a626b2f448bfe89885bb09c866a93f1ec877ce45007bed0d1b8038e89444415"} Apr 22 19:59:07.366072 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.366042 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l4pll" event={"ID":"0eaeb73f-d4a2-4a3a-8997-fd78247676aa","Type":"ContainerStarted","Data":"77617e43c4343a863cd85452f9039ca861344fc53f7770ce34d60bc6669169d9"} Apr 22 19:59:07.366246 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.366203 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:59:07.367348 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.367325 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"b8131c6f579c2f907dd7d20de0d640446e0d68e11d8d7350e728d6636c831b7b"} Apr 22 19:59:07.368448 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.368422 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" event={"ID":"1df16191-fb27-4b2c-b54d-efc9ceebda35","Type":"ContainerStarted","Data":"a1e5ec2cf7b0a292adba1205f94528c025be602564974d4730dd8b0c52ab2ce4"} Apr 22 19:59:07.383902 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:07.383849 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-l4pll" podStartSLOduration=65.383047589 podStartE2EDuration="1m8.383831813s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:59:04.031003771 +0000 UTC m=+65.549896784" lastFinishedPulling="2026-04-22 19:59:07.031787989 +0000 UTC m=+68.550681008" observedRunningTime="2026-04-22 19:59:07.382514013 +0000 UTC m=+68.901407051" watchObservedRunningTime="2026-04-22 19:59:07.383831813 +0000 UTC m=+68.902724849" Apr 22 19:59:08.373784 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.373740 2580 generic.go:358] "Generic (PLEG): container finished" podID="3512a00a-a9e9-4a62-b382-f679dbdd1b67" containerID="f0ab06eee70a8d48f95aba5b87ad0f7f5110561dd4e66ff120e571e04ae9dcf8" exitCode=0 Apr 22 19:59:08.374388 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.373860 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8cpr" event={"ID":"3512a00a-a9e9-4a62-b382-f679dbdd1b67","Type":"ContainerDied","Data":"f0ab06eee70a8d48f95aba5b87ad0f7f5110561dd4e66ff120e571e04ae9dcf8"} Apr 22 19:59:08.934811 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.934775 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5548c4d88b-9bwd8"] Apr 22 19:59:08.938282 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.938242 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:08.942190 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.942133 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 19:59:08.942735 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.942407 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 19:59:08.942735 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.942503 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vxrvf\"" Apr 22 19:59:08.943679 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.942950 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:59:08.943679 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.943129 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4pgf0hv8kne2p\"" Apr 22 19:59:08.943679 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.943309 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 19:59:08.948808 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:08.948428 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5548c4d88b-9bwd8"] Apr 22 19:59:09.036906 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.036872 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-client-ca-bundle\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " 
pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.037094 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.036924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc2f49c1-8615-4b4d-af75-4ba188e460f0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.037094 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.037003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7k2\" (UniqueName: \"kubernetes.io/projected/cc2f49c1-8615-4b4d-af75-4ba188e460f0-kube-api-access-lz7k2\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.037094 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.037036 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cc2f49c1-8615-4b4d-af75-4ba188e460f0-audit-log\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.037094 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.037087 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cc2f49c1-8615-4b4d-af75-4ba188e460f0-metrics-server-audit-profiles\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.037291 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.037134 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-secret-metrics-server-client-certs\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.037291 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.037158 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-secret-metrics-server-tls\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.137817 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-client-ca-bundle\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.137817 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137713 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc2f49c1-8615-4b4d-af75-4ba188e460f0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.137817 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7k2\" (UniqueName: 
\"kubernetes.io/projected/cc2f49c1-8615-4b4d-af75-4ba188e460f0-kube-api-access-lz7k2\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.137817 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cc2f49c1-8615-4b4d-af75-4ba188e460f0-audit-log\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.137817 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137815 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cc2f49c1-8615-4b4d-af75-4ba188e460f0-metrics-server-audit-profiles\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.138812 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137850 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-secret-metrics-server-client-certs\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.138812 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.137873 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-secret-metrics-server-tls\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " 
pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.139158 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.139128 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cc2f49c1-8615-4b4d-af75-4ba188e460f0-audit-log\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.139979 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.139953 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc2f49c1-8615-4b4d-af75-4ba188e460f0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.140159 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.140003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cc2f49c1-8615-4b4d-af75-4ba188e460f0-metrics-server-audit-profiles\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.140832 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.140810 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-secret-metrics-server-tls\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.140925 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.140889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-client-ca-bundle\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.142083 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.142061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cc2f49c1-8615-4b4d-af75-4ba188e460f0-secret-metrics-server-client-certs\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.158770 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.158737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7k2\" (UniqueName: \"kubernetes.io/projected/cc2f49c1-8615-4b4d-af75-4ba188e460f0-kube-api-access-lz7k2\") pod \"metrics-server-5548c4d88b-9bwd8\" (UID: \"cc2f49c1-8615-4b4d-af75-4ba188e460f0\") " pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.250303 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.250267 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:09.313835 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.313798 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg"] Apr 22 19:59:09.318412 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.318387 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" Apr 22 19:59:09.320948 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.320913 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-wqrr2\"" Apr 22 19:59:09.320948 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.320919 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 19:59:09.326899 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.326684 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg"] Apr 22 19:59:09.339494 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.339460 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3aae5d5c-3e3d-41f7-8936-de19245f2588-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rt8sg\" (UID: \"3aae5d5c-3e3d-41f7-8936-de19245f2588\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" Apr 22 19:59:09.379469 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.379392 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j" event={"ID":"294d25c8-e955-4e1e-a99e-5c4f1130d221","Type":"ContainerStarted","Data":"ace27723b99ca3b6c76302cbdfc5ccba324c24da89ca9980a21e7435767233c2"} Apr 22 19:59:09.381318 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.381287 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8cpr" event={"ID":"3512a00a-a9e9-4a62-b382-f679dbdd1b67","Type":"ContainerStarted","Data":"b1176c1b265b41ecf01fc91c4c98668b1b778f58afdc89111387aefad399f38e"} Apr 22 19:59:09.401859 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.401754 2580 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-ffj7j" podStartSLOduration=3.6151012639999998 podStartE2EDuration="5.401736954s" podCreationTimestamp="2026-04-22 19:59:04 +0000 UTC" firstStartedPulling="2026-04-22 19:59:07.04366942 +0000 UTC m=+68.562562446" lastFinishedPulling="2026-04-22 19:59:08.830305122 +0000 UTC m=+70.349198136" observedRunningTime="2026-04-22 19:59:09.399474075 +0000 UTC m=+70.918367111" watchObservedRunningTime="2026-04-22 19:59:09.401736954 +0000 UTC m=+70.920629988" Apr 22 19:59:09.440591 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.440560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3aae5d5c-3e3d-41f7-8936-de19245f2588-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rt8sg\" (UID: \"3aae5d5c-3e3d-41f7-8936-de19245f2588\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" Apr 22 19:59:09.440691 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:09.440671 2580 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 19:59:09.440741 ip-10-0-131-194 kubenswrapper[2580]: E0422 19:59:09.440736 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3aae5d5c-3e3d-41f7-8936-de19245f2588-monitoring-plugin-cert podName:3aae5d5c-3e3d-41f7-8936-de19245f2588 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:09.940714497 +0000 UTC m=+71.459607514 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/3aae5d5c-3e3d-41f7-8936-de19245f2588-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-rt8sg" (UID: "3aae5d5c-3e3d-41f7-8936-de19245f2588") : secret "monitoring-plugin-cert" not found Apr 22 19:59:09.573470 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.573355 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5548c4d88b-9bwd8"] Apr 22 19:59:09.579857 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:09.579829 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2f49c1_8615_4b4d_af75_4ba188e460f0.slice/crio-3ae60dbcb3b5edd7f7cdedc72156c82c4f15f049376644973c3fe8374f07af82 WatchSource:0}: Error finding container 3ae60dbcb3b5edd7f7cdedc72156c82c4f15f049376644973c3fe8374f07af82: Status 404 returned error can't find the container with id 3ae60dbcb3b5edd7f7cdedc72156c82c4f15f049376644973c3fe8374f07af82 Apr 22 19:59:09.766193 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.766156 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5ffdb9456-lf64t"] Apr 22 19:59:09.781490 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.781019 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.782656 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.782577 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5ffdb9456-lf64t"]
Apr 22 19:59:09.783954 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.783933 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 19:59:09.784217 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.783998 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 19:59:09.784321 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.784275 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 19:59:09.784383 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.784157 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 19:59:09.784443 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.784151 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 19:59:09.784804 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.784784 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-smm7p\""
Apr 22 19:59:09.788980 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.788953 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 19:59:09.845914 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.845875 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-serving-certs-ca-bundle\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.845914 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.845935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcl86\" (UniqueName: \"kubernetes.io/projected/46272374-b555-4cc1-bf00-d09835122abb-kube-api-access-wcl86\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.846235 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.846030 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-metrics-client-ca\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.846235 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.846063 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.846235 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.846100 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-secret-telemeter-client\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.846235 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.846131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.846464 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.846322 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-telemeter-client-tls\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.846464 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.846370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-federate-client-tls\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947643 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-federate-client-tls\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947656 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-serving-certs-ca-bundle\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcl86\" (UniqueName: \"kubernetes.io/projected/46272374-b555-4cc1-bf00-d09835122abb-kube-api-access-wcl86\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3aae5d5c-3e3d-41f7-8936-de19245f2588-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rt8sg\" (UID: \"3aae5d5c-3e3d-41f7-8936-de19245f2588\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-metrics-client-ca\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947807 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-secret-telemeter-client\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.947850 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947850 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.948287 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.947976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-telemeter-client-tls\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.948801 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.948765 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-serving-certs-ca-bundle\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.949379 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.949352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.949723 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.949703 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46272374-b555-4cc1-bf00-d09835122abb-metrics-client-ca\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.950751 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.950711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3aae5d5c-3e3d-41f7-8936-de19245f2588-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rt8sg\" (UID: \"3aae5d5c-3e3d-41f7-8936-de19245f2588\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg"
Apr 22 19:59:09.950961 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.950942 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-federate-client-tls\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.951212 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.951185 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-secret-telemeter-client\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.951417 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.951399 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-telemeter-client-tls\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.951491 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.951470 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46272374-b555-4cc1-bf00-d09835122abb-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:09.956338 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:09.956313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcl86\" (UniqueName: \"kubernetes.io/projected/46272374-b555-4cc1-bf00-d09835122abb-kube-api-access-wcl86\") pod \"telemeter-client-5ffdb9456-lf64t\" (UID: \"46272374-b555-4cc1-bf00-d09835122abb\") " pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:10.095965 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.095927 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t"
Apr 22 19:59:10.230708 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.230615 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg"
Apr 22 19:59:10.242761 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.242718 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5ffdb9456-lf64t"]
Apr 22 19:59:10.245454 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:10.245421 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46272374_b555_4cc1_bf00_d09835122abb.slice/crio-bc4ba0ad330d271a30ed46b6fb1781250592e2594d7f952c0b52d27eb34ec83e WatchSource:0}: Error finding container bc4ba0ad330d271a30ed46b6fb1781250592e2594d7f952c0b52d27eb34ec83e: Status 404 returned error can't find the container with id bc4ba0ad330d271a30ed46b6fb1781250592e2594d7f952c0b52d27eb34ec83e
Apr 22 19:59:10.367521 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.367489 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg"]
Apr 22 19:59:10.370902 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:10.370863 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aae5d5c_3e3d_41f7_8936_de19245f2588.slice/crio-4b5c0f4847ea38d4499f4b48d279920f2c72a56cfec6293f1c21a3da93a5c6fc WatchSource:0}: Error finding container 4b5c0f4847ea38d4499f4b48d279920f2c72a56cfec6293f1c21a3da93a5c6fc: Status 404 returned error can't find the container with id 4b5c0f4847ea38d4499f4b48d279920f2c72a56cfec6293f1c21a3da93a5c6fc
Apr 22 19:59:10.387106 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.387049 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" event={"ID":"1df16191-fb27-4b2c-b54d-efc9ceebda35","Type":"ContainerStarted","Data":"66a095949057090c6c86de48d069052ae345d66298edfb6d4b7daca00787bf34"}
Apr 22 19:59:10.387106 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.387094 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" event={"ID":"1df16191-fb27-4b2c-b54d-efc9ceebda35","Type":"ContainerStarted","Data":"f2a50dfc842dc03611048c7a58e4f0479ada9e8682ded0a29e9e26b181a081fa"}
Apr 22 19:59:10.387106 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.387109 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" event={"ID":"1df16191-fb27-4b2c-b54d-efc9ceebda35","Type":"ContainerStarted","Data":"3afbbb9b79871fd0fb3ff67a7d297d8f8b6bca4aa32ca0f00d1fbafb8edb2fb1"}
Apr 22 19:59:10.392587 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.392546 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k8cpr" event={"ID":"3512a00a-a9e9-4a62-b382-f679dbdd1b67","Type":"ContainerStarted","Data":"63c3465b0764520e93fc189ef0e40d32071dbe3cc1800b1ee842175431014c0e"}
Apr 22 19:59:10.395857 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.395823 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"790898563bda2c72d1a81a705bc793cc9409720773f9ee93f65186ec2189b73d"}
Apr 22 19:59:10.395974 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.395865 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"33db083f59d64d905a59443031c357c2720dde08ba312649332e8040f8c65787"}
Apr 22 19:59:10.395974 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.395879 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"68576f821f3085e41b4cc268ade30f67cb0961acd224b0daf9a7898f4f4272c9"}
Apr 22 19:59:10.398067 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.397386 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7" exitCode=0
Apr 22 19:59:10.398067 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.397650 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7"}
Apr 22 19:59:10.399099 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.399076 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" event={"ID":"3aae5d5c-3e3d-41f7-8936-de19245f2588","Type":"ContainerStarted","Data":"4b5c0f4847ea38d4499f4b48d279920f2c72a56cfec6293f1c21a3da93a5c6fc"}
Apr 22 19:59:10.400494 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.400460 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" event={"ID":"cc2f49c1-8615-4b4d-af75-4ba188e460f0","Type":"ContainerStarted","Data":"3ae60dbcb3b5edd7f7cdedc72156c82c4f15f049376644973c3fe8374f07af82"}
Apr 22 19:59:10.402026 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.402004 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t" event={"ID":"46272374-b555-4cc1-bf00-d09835122abb","Type":"ContainerStarted","Data":"bc4ba0ad330d271a30ed46b6fb1781250592e2594d7f952c0b52d27eb34ec83e"}
Apr 22 19:59:10.407542 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.407492 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-56gsc" podStartSLOduration=4.728818624 podStartE2EDuration="6.40747712s" podCreationTimestamp="2026-04-22 19:59:04 +0000 UTC" firstStartedPulling="2026-04-22 19:59:07.151649701 +0000 UTC m=+68.670542717" lastFinishedPulling="2026-04-22 19:59:08.830308192 +0000 UTC m=+70.349201213" observedRunningTime="2026-04-22 19:59:10.405921185 +0000 UTC m=+71.924814234" watchObservedRunningTime="2026-04-22 19:59:10.40747712 +0000 UTC m=+71.926370156"
Apr 22 19:59:10.428218 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.428164 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k8cpr" podStartSLOduration=4.618055942 podStartE2EDuration="6.428148386s" podCreationTimestamp="2026-04-22 19:59:04 +0000 UTC" firstStartedPulling="2026-04-22 19:59:05.62707829 +0000 UTC m=+67.145971318" lastFinishedPulling="2026-04-22 19:59:07.437170746 +0000 UTC m=+68.956063762" observedRunningTime="2026-04-22 19:59:10.426672336 +0000 UTC m=+71.945565372" watchObservedRunningTime="2026-04-22 19:59:10.428148386 +0000 UTC m=+71.947041611"
Apr 22 19:59:10.768843 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.768802 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:59:10.778471 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.778436 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.781265 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.781232 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mv6l8\""
Apr 22 19:59:10.781612 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.781592 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8ivqgt401gq4c\""
Apr 22 19:59:10.781932 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.781613 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 19:59:10.782050 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.781613 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 19:59:10.782366 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.781738 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 19:59:10.782459 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.782047 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 19:59:10.782459 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.782111 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 19:59:10.782459 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.782162 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 19:59:10.782628 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.782299 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 19:59:10.782628 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.782348 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 19:59:10.782719 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.782664 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 19:59:10.785417 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.784294 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 19:59:10.787217 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.787146 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 19:59:10.798245 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.798220 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 19:59:10.799221 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.799194 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 19:59:10.856114 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856070 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856288 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856169 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856288 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856431 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856327 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856431 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856393 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmq47\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-kube-api-access-jmq47\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856533 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856533 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856463 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856533 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856489 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856655 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856538 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856655 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856583 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856655 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856655 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856783 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856672 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856783 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856708 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856783 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856744 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856783 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856768 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856941 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856798 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.856941 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.856827 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.957648 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957612 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.957829 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.957829 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.957938 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957874 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.957938 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957897 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmq47\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-kube-api-access-jmq47\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.957938 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957932 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.957990 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.958038 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.958061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.958074 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.958120 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.958798 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.958147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.959425 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.959352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963084 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963034 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963537 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963508 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963654 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963654 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963828 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963828 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963688 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963828 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 19:59:10.963828 ip-10-0-131-194
kubenswrapper[2580]: I0422 19:59:10.963760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.964031 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.963896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.964861 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.964541 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.964861 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.964615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.964861 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.964822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.965350 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.965320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.966316 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.966292 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.966421 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.966362 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config-out\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.966682 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.966661 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.967657 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.967630 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.967939 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.967914 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-web-config\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.968084 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.968061 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.968912 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.968887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.970205 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.970183 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmq47\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-kube-api-access-jmq47\") pod \"prometheus-k8s-0\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:10.970414 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:10.970397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.103163 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:11.101941 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:11.415049 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:11.414971 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" event={"ID":"cc2f49c1-8615-4b4d-af75-4ba188e460f0","Type":"ContainerStarted","Data":"4ebf87bef33dd8a5f94c39275754bd42cd11366db1b6ce1bbf36c8f4af5bbb82"} Apr 22 19:59:11.422488 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:11.421750 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"20c4df01bbeea86e3140a2d77e470f246f1f3e5ef4c3f74a94e0e0f292ab939b"} Apr 22 19:59:11.434943 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:11.433449 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" podStartSLOduration=1.7449365810000002 podStartE2EDuration="3.433430488s" podCreationTimestamp="2026-04-22 19:59:08 +0000 UTC" firstStartedPulling="2026-04-22 19:59:09.583406936 +0000 UTC m=+71.102299963" lastFinishedPulling="2026-04-22 19:59:11.27190085 +0000 UTC m=+72.790793870" observedRunningTime="2026-04-22 19:59:11.432628525 +0000 UTC m=+72.951521561" watchObservedRunningTime="2026-04-22 19:59:11.433430488 +0000 UTC m=+72.952323524" Apr 22 19:59:11.448848 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:11.448709 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 19:59:11.451599 ip-10-0-131-194 kubenswrapper[2580]: W0422 19:59:11.451571 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd6b8d80_91b9_4b46_8aba_4bc5da59a846.slice/crio-0cd9b5c2fd02f8eaa2b801f717011236d37e9df0dacd07a38e18b67e4ee6ba97 WatchSource:0}: Error finding container 0cd9b5c2fd02f8eaa2b801f717011236d37e9df0dacd07a38e18b67e4ee6ba97: Status 404 returned error can't find the container with id 0cd9b5c2fd02f8eaa2b801f717011236d37e9df0dacd07a38e18b67e4ee6ba97 Apr 22 19:59:12.427886 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:12.427820 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"6d2d8fadd98031fd129ce62df83ef7abaa4b71dadd6811113abb3c2625b3a3a3"} Apr 22 19:59:12.428473 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:12.428408 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:12.428473 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:12.428438 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" event={"ID":"3a405cd9-a1f0-440e-a6a7-dbcea8867edb","Type":"ContainerStarted","Data":"db6de03ce7feb7957ceec4de3412a57e4fbb274e997e199d7f9d015d464c696b"} Apr 22 19:59:12.429770 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:12.429742 2580 generic.go:358] "Generic (PLEG): container finished" podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" exitCode=0 Apr 22 19:59:12.429853 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:12.429820 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} Apr 22 19:59:12.429853 ip-10-0-131-194 kubenswrapper[2580]: I0422 
19:59:12.429846 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"0cd9b5c2fd02f8eaa2b801f717011236d37e9df0dacd07a38e18b67e4ee6ba97"} Apr 22 19:59:12.461207 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:12.461151 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" podStartSLOduration=2.3305950810000002 podStartE2EDuration="6.461118696s" podCreationTimestamp="2026-04-22 19:59:06 +0000 UTC" firstStartedPulling="2026-04-22 19:59:07.13916877 +0000 UTC m=+68.658061788" lastFinishedPulling="2026-04-22 19:59:11.269692374 +0000 UTC m=+72.788585403" observedRunningTime="2026-04-22 19:59:12.460059645 +0000 UTC m=+73.978952680" watchObservedRunningTime="2026-04-22 19:59:12.461118696 +0000 UTC m=+73.980011730" Apr 22 19:59:13.434825 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.434788 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t" event={"ID":"46272374-b555-4cc1-bf00-d09835122abb","Type":"ContainerStarted","Data":"1ed02a4b42369f31d11e00ecd7b25602ef18f1d7c413f7218872f3a9782f1139"} Apr 22 19:59:13.435120 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.434843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t" event={"ID":"46272374-b555-4cc1-bf00-d09835122abb","Type":"ContainerStarted","Data":"888a36a058cdb6dbf5aaed862f36a0c751a3e4aee8441d59cb6196f3205957e6"} Apr 22 19:59:13.435120 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.434859 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t" event={"ID":"46272374-b555-4cc1-bf00-d09835122abb","Type":"ContainerStarted","Data":"ad1ab6854ee9c99106e030a9322ceda74cbfe3edf1f83f1afad753e67fd44cdd"} Apr 22 19:59:13.438327 
ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.438304 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8"} Apr 22 19:59:13.438441 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.438332 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928"} Apr 22 19:59:13.438441 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.438342 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944"} Apr 22 19:59:13.438441 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.438350 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539"} Apr 22 19:59:13.440096 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.439552 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" event={"ID":"3aae5d5c-3e3d-41f7-8936-de19245f2588","Type":"ContainerStarted","Data":"36d477d1f1e09871e272efc6038d00fa24167f57bf7f76954d2f283fd537d902"} Apr 22 19:59:13.440300 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.440280 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" Apr 22 19:59:13.445547 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.445519 2580 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" Apr 22 19:59:13.458201 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.458147 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5ffdb9456-lf64t" podStartSLOduration=1.6028617920000001 podStartE2EDuration="4.458132578s" podCreationTimestamp="2026-04-22 19:59:09 +0000 UTC" firstStartedPulling="2026-04-22 19:59:10.247714631 +0000 UTC m=+71.766607650" lastFinishedPulling="2026-04-22 19:59:13.102985419 +0000 UTC m=+74.621878436" observedRunningTime="2026-04-22 19:59:13.456170971 +0000 UTC m=+74.975064044" watchObservedRunningTime="2026-04-22 19:59:13.458132578 +0000 UTC m=+74.977025613" Apr 22 19:59:13.474336 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:13.473633 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rt8sg" podStartSLOduration=1.745897828 podStartE2EDuration="4.473618123s" podCreationTimestamp="2026-04-22 19:59:09 +0000 UTC" firstStartedPulling="2026-04-22 19:59:10.373625625 +0000 UTC m=+71.892518638" lastFinishedPulling="2026-04-22 19:59:13.101345906 +0000 UTC m=+74.620238933" observedRunningTime="2026-04-22 19:59:13.47175212 +0000 UTC m=+74.990645156" watchObservedRunningTime="2026-04-22 19:59:13.473618123 +0000 UTC m=+74.992511157" Apr 22 19:59:14.447595 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:14.447560 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f"} Apr 22 19:59:14.448080 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:14.447608 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerStarted","Data":"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a"} Apr 22 19:59:14.474788 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:14.474727 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.34154042 podStartE2EDuration="9.474709616s" podCreationTimestamp="2026-04-22 19:59:05 +0000 UTC" firstStartedPulling="2026-04-22 19:59:06.971700773 +0000 UTC m=+68.490593800" lastFinishedPulling="2026-04-22 19:59:13.104869978 +0000 UTC m=+74.623762996" observedRunningTime="2026-04-22 19:59:14.472690231 +0000 UTC m=+75.991583267" watchObservedRunningTime="2026-04-22 19:59:14.474709616 +0000 UTC m=+75.993602652" Apr 22 19:59:16.456073 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.456037 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} Apr 22 19:59:16.456462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.456081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} Apr 22 19:59:16.456462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.456096 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} Apr 22 19:59:16.456462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.456110 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} Apr 22 19:59:16.456462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.456123 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} Apr 22 19:59:16.456462 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.456134 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerStarted","Data":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} Apr 22 19:59:16.487922 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:16.487858 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.332141714 podStartE2EDuration="6.487842141s" podCreationTimestamp="2026-04-22 19:59:10 +0000 UTC" firstStartedPulling="2026-04-22 19:59:12.43114167 +0000 UTC m=+73.950034684" lastFinishedPulling="2026-04-22 19:59:15.586842097 +0000 UTC m=+77.105735111" observedRunningTime="2026-04-22 19:59:16.486846779 +0000 UTC m=+78.005739849" watchObservedRunningTime="2026-04-22 19:59:16.487842141 +0000 UTC m=+78.006735177" Apr 22 19:59:18.445873 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:18.445841 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7b4fdb9788-4hxj7" Apr 22 19:59:21.102854 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:21.102818 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 19:59:29.251350 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:29.251313 2580 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:29.251350 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:29.251356 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:33.546115 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:33.546086 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wvnxp_ca734b69-52bc-4ae7-9171-1860a1388b9f/serve-healthcheck-canary/0.log" Apr 22 19:59:38.376542 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:38.376510 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l4pll" Apr 22 19:59:49.256625 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:49.256589 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 19:59:49.260566 ip-10-0-131-194 kubenswrapper[2580]: I0422 19:59:49.260541 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5548c4d88b-9bwd8" Apr 22 20:00:11.103117 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:11.103065 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:11.123714 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:11.123688 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:11.639865 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:11.639836 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:24.870144 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870097 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:24.870794 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870737 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="alertmanager" containerID="cri-o://595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539" gracePeriod=120 Apr 22 20:00:24.870910 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870785 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-metric" containerID="cri-o://7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a" gracePeriod=120 Apr 22 20:00:24.870910 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870802 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-web" containerID="cri-o://92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928" gracePeriod=120 Apr 22 20:00:24.870910 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870847 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy" containerID="cri-o://0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8" gracePeriod=120 Apr 22 20:00:24.870910 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870878 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="config-reloader" containerID="cri-o://955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944" gracePeriod=120 Apr 22 
20:00:24.871074 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:24.870845 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="prom-label-proxy" containerID="cri-o://012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f" gracePeriod=120 Apr 22 20:00:25.667884 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667849 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f" exitCode=0 Apr 22 20:00:25.667884 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667877 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8" exitCode=0 Apr 22 20:00:25.667884 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667884 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944" exitCode=0 Apr 22 20:00:25.667884 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667889 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539" exitCode=0 Apr 22 20:00:25.668147 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667923 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f"} Apr 22 20:00:25.668147 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667957 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8"} Apr 22 20:00:25.668147 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667967 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944"} Apr 22 20:00:25.668147 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:25.667976 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539"} Apr 22 20:00:26.112433 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.112404 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:26.245568 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245495 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-main-tls\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245568 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245534 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-config-out\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245568 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245568 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-cluster-tls-config\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245842 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245596 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9klp\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-kube-api-access-w9klp\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245842 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245657 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245842 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245711 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-tls-assets\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245842 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245741 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-trusted-ca-bundle\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245842 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245775 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-metrics-client-ca\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.245842 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245803 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-config-volume\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.246333 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245835 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.246333 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245886 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-web-config\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.246333 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245916 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-main-db\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.246333 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.245953 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-web\") pod \"b5b894a9-0010-423c-9537-f336848aa436\" (UID: \"b5b894a9-0010-423c-9537-f336848aa436\") " Apr 22 20:00:26.247384 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.247332 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:26.247384 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.247347 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:26.247716 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.247694 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:00:26.249375 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.249333 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-config-out" (OuterVolumeSpecName: "config-out") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:00:26.249482 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.249434 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-kube-api-access-w9klp" (OuterVolumeSpecName: "kube-api-access-w9klp") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "kube-api-access-w9klp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:26.249545 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.249515 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.249607 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.249571 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.250030 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.250005 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.250230 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.250212 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.250328 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.250306 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:26.250850 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.250829 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.254727 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.254707 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.260203 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.260179 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-web-config" (OuterVolumeSpecName: "web-config") pod "b5b894a9-0010-423c-9537-f336848aa436" (UID: "b5b894a9-0010-423c-9537-f336848aa436"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:26.347505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347461 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-tls-assets\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347501 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347515 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5b894a9-0010-423c-9537-f336848aa436-metrics-client-ca\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 
20:00:26.347529 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-config-volume\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347540 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347552 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-web-config\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347563 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-alertmanager-main-db\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347574 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347586 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-main-tls\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347598 2580 
reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b5b894a9-0010-423c-9537-f336848aa436-config-out\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347609 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-cluster-tls-config\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347622 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9klp\" (UniqueName: \"kubernetes.io/projected/b5b894a9-0010-423c-9537-f336848aa436-kube-api-access-w9klp\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.347786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.347644 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b5b894a9-0010-423c-9537-f336848aa436-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:26.673141 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673051 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a" exitCode=0 Apr 22 20:00:26.673141 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673084 2580 generic.go:358] "Generic (PLEG): container finished" podID="b5b894a9-0010-423c-9537-f336848aa436" containerID="92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928" exitCode=0 Apr 22 20:00:26.673346 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673138 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a"} Apr 22 20:00:26.673346 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673182 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928"} Apr 22 20:00:26.673346 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673195 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 20:00:26.673346 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673208 2580 scope.go:117] "RemoveContainer" containerID="012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f" Apr 22 20:00:26.673346 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.673197 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b5b894a9-0010-423c-9537-f336848aa436","Type":"ContainerDied","Data":"b8131c6f579c2f907dd7d20de0d640446e0d68e11d8d7350e728d6636c831b7b"} Apr 22 20:00:26.681462 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.681445 2580 scope.go:117] "RemoveContainer" containerID="7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a" Apr 22 20:00:26.688350 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.688331 2580 scope.go:117] "RemoveContainer" containerID="0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8" Apr 22 20:00:26.697181 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.697163 2580 scope.go:117] "RemoveContainer" containerID="92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928" Apr 22 20:00:26.699003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.698980 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:26.702169 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.702143 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 20:00:26.705326 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.705307 2580 scope.go:117] "RemoveContainer" containerID="955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944" Apr 22 20:00:26.712188 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.712171 2580 scope.go:117] "RemoveContainer" containerID="595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539" Apr 22 20:00:26.719063 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.719049 2580 scope.go:117] "RemoveContainer" containerID="2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7" Apr 22 20:00:26.725675 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.725660 2580 scope.go:117] "RemoveContainer" containerID="012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f" Apr 22 20:00:26.725953 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.725936 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f\": container with ID starting with 012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f not found: ID does not exist" containerID="012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f" Apr 22 20:00:26.726003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.725966 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f"} err="failed to get container status \"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f\": rpc error: code = NotFound desc = could not find container \"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f\": 
container with ID starting with 012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f not found: ID does not exist" Apr 22 20:00:26.726044 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726005 2580 scope.go:117] "RemoveContainer" containerID="7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a" Apr 22 20:00:26.726230 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.726213 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a\": container with ID starting with 7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a not found: ID does not exist" containerID="7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a" Apr 22 20:00:26.726353 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726238 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a"} err="failed to get container status \"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a\": rpc error: code = NotFound desc = could not find container \"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a\": container with ID starting with 7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a not found: ID does not exist" Apr 22 20:00:26.726353 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726300 2580 scope.go:117] "RemoveContainer" containerID="0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8" Apr 22 20:00:26.726526 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.726511 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8\": container with ID starting with 0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8 not 
found: ID does not exist" containerID="0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8" Apr 22 20:00:26.726563 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726531 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8"} err="failed to get container status \"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8\": rpc error: code = NotFound desc = could not find container \"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8\": container with ID starting with 0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8 not found: ID does not exist" Apr 22 20:00:26.726563 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726546 2580 scope.go:117] "RemoveContainer" containerID="92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928" Apr 22 20:00:26.726775 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.726759 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928\": container with ID starting with 92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928 not found: ID does not exist" containerID="92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928" Apr 22 20:00:26.726826 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726777 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928"} err="failed to get container status \"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928\": rpc error: code = NotFound desc = could not find container \"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928\": container with ID starting with 92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928 not found: ID does 
not exist" Apr 22 20:00:26.726826 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.726789 2580 scope.go:117] "RemoveContainer" containerID="955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944" Apr 22 20:00:26.727017 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.727001 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944\": container with ID starting with 955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944 not found: ID does not exist" containerID="955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944" Apr 22 20:00:26.727055 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727022 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944"} err="failed to get container status \"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944\": rpc error: code = NotFound desc = could not find container \"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944\": container with ID starting with 955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944 not found: ID does not exist" Apr 22 20:00:26.727055 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727034 2580 scope.go:117] "RemoveContainer" containerID="595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539" Apr 22 20:00:26.727241 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.727227 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539\": container with ID starting with 595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539 not found: ID does not exist" containerID="595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539" Apr 22 
20:00:26.727430 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727245 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539"} err="failed to get container status \"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539\": rpc error: code = NotFound desc = could not find container \"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539\": container with ID starting with 595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539 not found: ID does not exist" Apr 22 20:00:26.727430 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727300 2580 scope.go:117] "RemoveContainer" containerID="2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7" Apr 22 20:00:26.727523 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:26.727505 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7\": container with ID starting with 2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7 not found: ID does not exist" containerID="2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7" Apr 22 20:00:26.727559 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727520 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7"} err="failed to get container status \"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7\": rpc error: code = NotFound desc = could not find container \"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7\": container with ID starting with 2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7 not found: ID does not exist" Apr 22 20:00:26.727559 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727531 2580 scope.go:117] 
"RemoveContainer" containerID="012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f" Apr 22 20:00:26.727745 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727725 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f"} err="failed to get container status \"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f\": rpc error: code = NotFound desc = could not find container \"012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f\": container with ID starting with 012df0f65158020155ba19ca63cac5b6d674eb23aa91747892118cd72d15526f not found: ID does not exist" Apr 22 20:00:26.727786 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727745 2580 scope.go:117] "RemoveContainer" containerID="7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a" Apr 22 20:00:26.727918 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727902 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a"} err="failed to get container status \"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a\": rpc error: code = NotFound desc = could not find container \"7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a\": container with ID starting with 7ceacd919d7e450ab8b10d2278c55a757a706a9bea24451e0a8a905ec476396a not found: ID does not exist" Apr 22 20:00:26.727958 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.727919 2580 scope.go:117] "RemoveContainer" containerID="0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8" Apr 22 20:00:26.728095 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.728080 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8"} err="failed to get container 
status \"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8\": rpc error: code = NotFound desc = could not find container \"0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8\": container with ID starting with 0242a894d86690b5922520a634ef88a536be05ed44c3f28f483cbe46f9f068b8 not found: ID does not exist" Apr 22 20:00:26.728131 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.728095 2580 scope.go:117] "RemoveContainer" containerID="92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928" Apr 22 20:00:26.728288 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.728273 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928"} err="failed to get container status \"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928\": rpc error: code = NotFound desc = could not find container \"92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928\": container with ID starting with 92af87ad607ac9411133256cba57f8825fd028d279a846db9de6acb4e8715928 not found: ID does not exist" Apr 22 20:00:26.728348 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.728289 2580 scope.go:117] "RemoveContainer" containerID="955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944" Apr 22 20:00:26.728612 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.728566 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944"} err="failed to get container status \"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944\": rpc error: code = NotFound desc = could not find container \"955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944\": container with ID starting with 955c3eb4c2e0ecfe0660cb89c82ef624b77a17ee0b4197a9a9c00460cd6d0944 not found: ID does not exist" Apr 22 20:00:26.728676 ip-10-0-131-194 
kubenswrapper[2580]: I0422 20:00:26.728634 2580 scope.go:117] "RemoveContainer" containerID="595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539"
Apr 22 20:00:26.729311 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.729097 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539"} err="failed to get container status \"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539\": rpc error: code = NotFound desc = could not find container \"595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539\": container with ID starting with 595c33b191d0a49fb202f64022486f30a7688906fffe0878b464ebc4edc66539 not found: ID does not exist"
Apr 22 20:00:26.729311 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.729122 2580 scope.go:117] "RemoveContainer" containerID="2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7"
Apr 22 20:00:26.729556 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.729529 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7"} err="failed to get container status \"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7\": rpc error: code = NotFound desc = could not find container \"2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7\": container with ID starting with 2d1a9384d773c3cd0bbadb606e542fa724f8de58fb5c34b3b9d5c5ca9bc9c3f7 not found: ID does not exist"
Apr 22 20:00:26.730641 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730620 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 20:00:26.730954 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730940 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy"
Apr 22 20:00:26.730954 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730955 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730966 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-web"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730972 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-web"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730981 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="prom-label-proxy"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730986 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="prom-label-proxy"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.730998 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="init-config-reloader"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731003 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="init-config-reloader"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731010 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-metric"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731015 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-metric"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731022 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="alertmanager"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731028 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="alertmanager"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731037 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="config-reloader"
Apr 22 20:00:26.731066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731042 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="config-reloader"
Apr 22 20:00:26.731484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731086 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-web"
Apr 22 20:00:26.731484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731095 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy-metric"
Apr 22 20:00:26.731484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731102 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="config-reloader"
Apr 22 20:00:26.731484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731109 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="prom-label-proxy"
Apr 22 20:00:26.731484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731115 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="kube-rbac-proxy"
Apr 22 20:00:26.731484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.731122 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b894a9-0010-423c-9537-f336848aa436" containerName="alertmanager"
Apr 22 20:00:26.736355 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.736337 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.738888 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.738865 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 20:00:26.739006 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.738874 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 20:00:26.739006 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.738937 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2pj74\""
Apr 22 20:00:26.739861 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.739841 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 20:00:26.739970 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.739878 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 20:00:26.739970 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.739904 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 20:00:26.739970 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.739946 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 20:00:26.740133 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.739908 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 20:00:26.740311 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.740294 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 20:00:26.745094 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.745076 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 20:00:26.749806 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.749785 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 20:00:26.852105 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852068 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ac443213-2dbe-46bd-8c27-4653e0c88871-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852105 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852110 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac443213-2dbe-46bd-8c27-4653e0c88871-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852359 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852135 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852359 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852156 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852359 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852178 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-web-config\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852359 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852194 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-config-volume\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852359 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852216 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbht\" (UniqueName: \"kubernetes.io/projected/ac443213-2dbe-46bd-8c27-4653e0c88871-kube-api-access-tnbht\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852359 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac443213-2dbe-46bd-8c27-4653e0c88871-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852584 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852362 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852584 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852389 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac443213-2dbe-46bd-8c27-4653e0c88871-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852584 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852414 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac443213-2dbe-46bd-8c27-4653e0c88871-config-out\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852584 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852430 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.852584 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.852446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953080 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953044 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ac443213-2dbe-46bd-8c27-4653e0c88871-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953209 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac443213-2dbe-46bd-8c27-4653e0c88871-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953209 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953209 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953128 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953209 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953157 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-web-config\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953209 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-config-volume\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbht\" (UniqueName: \"kubernetes.io/projected/ac443213-2dbe-46bd-8c27-4653e0c88871-kube-api-access-tnbht\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac443213-2dbe-46bd-8c27-4653e0c88871-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953331 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac443213-2dbe-46bd-8c27-4653e0c88871-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac443213-2dbe-46bd-8c27-4653e0c88871-config-out\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953429 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953459 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953456 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953767 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953499 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ac443213-2dbe-46bd-8c27-4653e0c88871-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.953948 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.953910 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ac443213-2dbe-46bd-8c27-4653e0c88871-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.955134 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.955104 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac443213-2dbe-46bd-8c27-4653e0c88871-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.956584 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.956468 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ac443213-2dbe-46bd-8c27-4653e0c88871-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.956871 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.956790 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-web-config\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.956871 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.956790 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ac443213-2dbe-46bd-8c27-4653e0c88871-config-out\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.957008 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.956923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-config-volume\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.957008 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.956946 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.957116 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.957075 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.957286 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.957269 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.957823 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.957800 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.958778 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.958761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ac443213-2dbe-46bd-8c27-4653e0c88871-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:26.961756 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:26.961739 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbht\" (UniqueName: \"kubernetes.io/projected/ac443213-2dbe-46bd-8c27-4653e0c88871-kube-api-access-tnbht\") pod \"alertmanager-main-0\" (UID: \"ac443213-2dbe-46bd-8c27-4653e0c88871\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:27.049515 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:27.049482 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 20:00:27.068633 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:27.068600 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b894a9-0010-423c-9537-f336848aa436" path="/var/lib/kubelet/pods/b5b894a9-0010-423c-9537-f336848aa436/volumes"
Apr 22 20:00:27.193098 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:27.193013 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 20:00:27.197010 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:00:27.196984 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac443213_2dbe_46bd_8c27_4653e0c88871.slice/crio-edbe2fd337e2e0cb1a74d19a95bae7aa0f362bb62301d2a1bc6d860c584e80e5 WatchSource:0}: Error finding container edbe2fd337e2e0cb1a74d19a95bae7aa0f362bb62301d2a1bc6d860c584e80e5: Status 404 returned error can't find the container with id edbe2fd337e2e0cb1a74d19a95bae7aa0f362bb62301d2a1bc6d860c584e80e5
Apr 22 20:00:27.678857 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:27.678773 2580 generic.go:358] "Generic (PLEG): container finished" podID="ac443213-2dbe-46bd-8c27-4653e0c88871" containerID="2a45cd417c4f5a0a25700b4eddc6c59bcb1dfe67df2cd50f9aac50105c0967cc" exitCode=0
Apr 22 20:00:27.678857 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:27.678838 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerDied","Data":"2a45cd417c4f5a0a25700b4eddc6c59bcb1dfe67df2cd50f9aac50105c0967cc"}
Apr 22 20:00:27.679034 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:27.678879 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"edbe2fd337e2e0cb1a74d19a95bae7aa0f362bb62301d2a1bc6d860c584e80e5"}
Apr 22 20:00:28.684907 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.684874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"c082abfb3099c63a9979bd27452ca65bcf1af0b79bb8ce5065bb181bcc23e7eb"}
Apr 22 20:00:28.684907 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.684908 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"b82b043d916229d468eae8d96b06a9977bf1a88d8a94a830aeedbe3f6ff4862f"}
Apr 22 20:00:28.685344 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.684921 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"a18f1ef08c9f2d98434b7ce719cdd9ca1d69385e681feb879fb36bb85e0a449a"}
Apr 22 20:00:28.685344 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.684929 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"b2ec8cd33a7691212f635689c5cde56c0981cfc262c2a8d7c4f91f42fe68d5fb"}
Apr 22 20:00:28.685344 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.684938 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"d15ea56c9d66ee996e711ea2f0957d6d72b1d3ff8217c0304eae391e54367091"}
Apr 22 20:00:28.685344 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.684946 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ac443213-2dbe-46bd-8c27-4653e0c88871","Type":"ContainerStarted","Data":"27364a098fd71d441f95b6d3605a25fb01482fb07e1cbcb4799cf58bd94ffab3"}
Apr 22 20:00:28.712867 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:28.712804 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.7127843499999997 podStartE2EDuration="2.71278435s" podCreationTimestamp="2026-04-22 20:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:28.710402982 +0000 UTC m=+150.229296017" watchObservedRunningTime="2026-04-22 20:00:28.71278435 +0000 UTC m=+150.231677397"
Apr 22 20:00:29.088101 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088003 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:29.088758 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088491 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="prometheus" containerID="cri-o://689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" gracePeriod=600
Apr 22 20:00:29.088758 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088518 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy" containerID="cri-o://f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" gracePeriod=600
Apr 22 20:00:29.088758 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088553 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-thanos" containerID="cri-o://1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" gracePeriod=600
Apr 22 20:00:29.088758 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088566 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-web" containerID="cri-o://bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" gracePeriod=600
Apr 22 20:00:29.088758 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088715 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="config-reloader" containerID="cri-o://211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" gracePeriod=600
Apr 22 20:00:29.089088 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.088525 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="thanos-sidecar" containerID="cri-o://9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" gracePeriod=600
Apr 22 20:00:29.441572 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.441549 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:29.477128 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477089 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-tls\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477128 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477130 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-metrics-client-ca\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477179 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-metrics-client-certs\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477205 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-grpc-tls\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477233 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477277 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config-out\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477301 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-kube-rbac-proxy\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477346 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-rulefiles-0\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477374 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-serving-certs-ca-bundle\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477421 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-db\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477448 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477478 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmq47\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-kube-api-access-jmq47\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477535 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-tls-assets\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477563 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-thanos-prometheus-http-client-file\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477604 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-trusted-ca-bundle\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477646 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-kubelet-serving-ca-bundle\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477687 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.477742 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.477711 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-web-config\") pod \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\" (UID: \"cd6b8d80-91b9-4b46-8aba-4bc5da59a846\") "
Apr 22 20:00:29.478351 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.478083 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "configmap-metrics-client-ca".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:29.479301 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.479155 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:29.479301 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.479193 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:29.481092 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.481040 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:29.481346 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.481317 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:00:29.483122 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.483087 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:00:29.483208 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.483113 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.483574 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.483538 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config" (OuterVolumeSpecName: "config") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.483843 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.483812 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.484282 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.484238 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.484701 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.484671 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.484824 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.484783 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.485060 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.485016 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.485282 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.485220 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:29.485475 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.485450 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-kube-api-access-jmq47" (OuterVolumeSpecName: "kube-api-access-jmq47") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "kube-api-access-jmq47". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:00:29.485953 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.485916 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.486173 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.486151 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config-out" (OuterVolumeSpecName: "config-out") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:00:29.495588 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.495553 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-web-config" (OuterVolumeSpecName: "web-config") pod "cd6b8d80-91b9-4b46-8aba-4bc5da59a846" (UID: "cd6b8d80-91b9-4b46-8aba-4bc5da59a846"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:00:29.579189 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579150 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579189 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579183 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-metrics-client-ca\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579189 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579195 2580 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-metrics-client-certs\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579205 2580 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-grpc-tls\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579216 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" 
(UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579225 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config-out\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579234 2580 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-kube-rbac-proxy\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579243 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579274 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579285 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-k8s-db\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579294 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579302 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jmq47\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-kube-api-access-jmq47\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579311 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-tls-assets\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579319 2580 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579327 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579335 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579344 2580 reconciler_common.go:299] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-config\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.579445 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.579352 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd6b8d80-91b9-4b46-8aba-4bc5da59a846-web-config\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:00:29.690706 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690668 2580 generic.go:358] "Generic (PLEG): container finished" podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" exitCode=0 Apr 22 20:00:29.690706 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690698 2580 generic.go:358] "Generic (PLEG): container finished" podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" exitCode=0 Apr 22 20:00:29.690706 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690707 2580 generic.go:358] "Generic (PLEG): container finished" podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" exitCode=0 Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690716 2580 generic.go:358] "Generic (PLEG): container finished" podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" exitCode=0 Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690723 2580 generic.go:358] "Generic (PLEG): container finished" podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" exitCode=0 Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690731 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" exitCode=0 Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690752 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690792 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690808 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690839 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cd6b8d80-91b9-4b46-8aba-4bc5da59a846","Type":"ContainerDied","Data":"0cd9b5c2fd02f8eaa2b801f717011236d37e9df0dacd07a38e18b67e4ee6ba97"} Apr 22 20:00:29.691210 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.690863 2580 scope.go:117] "RemoveContainer" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.701321 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.701191 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.708362 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.708342 2580 scope.go:117] "RemoveContainer" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 
20:00:29.715281 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.715233 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.716882 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.716863 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:29.721159 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.721136 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:29.723298 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.723285 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.729776 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.729761 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.736720 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.736704 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.743414 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.743397 2580 scope.go:117] "RemoveContainer" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.743661 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:29.743643 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.743715 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.743668 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} err="failed to get container status \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" Apr 22 20:00:29.743715 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.743686 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.743922 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:29.743905 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.743963 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.743928 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} err="failed to get container status \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" Apr 22 20:00:29.743963 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.743946 2580 scope.go:117] "RemoveContainer" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.744138 ip-10-0-131-194 
kubenswrapper[2580]: E0422 20:00:29.744121 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.744173 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744141 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} err="failed to get container status \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" Apr 22 20:00:29.744173 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744154 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.744363 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:29.744343 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.744400 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744383 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} 
err="failed to get container status \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" Apr 22 20:00:29.744400 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744396 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.744631 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:29.744615 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.744673 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744636 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} err="failed to get container status \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": rpc error: code = NotFound desc = could not find container \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" Apr 22 20:00:29.744673 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744653 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.744861 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:29.744845 2580 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.744900 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744866 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} err="failed to get container status \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" Apr 22 20:00:29.744900 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.744880 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.745105 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:00:29.745090 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.745146 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.745109 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} err="failed to get container status \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": rpc 
error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" Apr 22 20:00:29.745146 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.745131 2580 scope.go:117] "RemoveContainer" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.745389 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.745371 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} err="failed to get container status \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" Apr 22 20:00:29.745443 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.745389 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.745662 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.745604 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} err="failed to get container status \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" Apr 22 20:00:29.745662 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.745633 2580 scope.go:117] "RemoveContainer" 
containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.746338 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746042 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} err="failed to get container status \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" Apr 22 20:00:29.746338 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746072 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.746518 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746352 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} err="failed to get container status \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" Apr 22 20:00:29.746518 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746372 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.746661 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746614 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} err="failed to get container status 
\"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": rpc error: code = NotFound desc = could not find container \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" Apr 22 20:00:29.746705 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746664 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.746958 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746934 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} err="failed to get container status \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" Apr 22 20:00:29.747052 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.746964 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.747212 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747195 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} err="failed to get container status \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": rpc error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" Apr 22 20:00:29.747287 ip-10-0-131-194 
kubenswrapper[2580]: I0422 20:00:29.747212 2580 scope.go:117] "RemoveContainer" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.747422 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747405 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} err="failed to get container status \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" Apr 22 20:00:29.747422 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747421 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.747646 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747620 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} err="failed to get container status \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" Apr 22 20:00:29.747757 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747647 2580 scope.go:117] "RemoveContainer" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.747834 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747818 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:29.747894 ip-10-0-131-194 
kubenswrapper[2580]: I0422 20:00:29.747879 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} err="failed to get container status \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" Apr 22 20:00:29.747936 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.747894 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.748120 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748103 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} err="failed to get container status \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" Apr 22 20:00:29.748120 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748119 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748161 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748174 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" 
containerName="kube-rbac-proxy" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748189 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="init-config-reloader" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748194 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="init-config-reloader" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748213 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="prometheus" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748219 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="prometheus" Apr 22 20:00:29.748235 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748231 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="config-reloader" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748240 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="config-reloader" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748264 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-thanos" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748273 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-thanos" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748285 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="thanos-sidecar" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748293 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="thanos-sidecar" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748300 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-web" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748305 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-web" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748362 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748365 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} err="failed to get container status \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": rpc error: code = NotFound desc = could not find container \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748385 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748372 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" 
containerName="prometheus" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748424 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-web" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748434 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="kube-rbac-proxy-thanos" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748444 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="config-reloader" Apr 22 20:00:29.748505 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748451 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" containerName="thanos-sidecar" Apr 22 20:00:29.749003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748647 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} err="failed to get container status \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" Apr 22 20:00:29.749003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748669 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.749003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748905 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} err="failed to 
get container status \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": rpc error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" Apr 22 20:00:29.749003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.748925 2580 scope.go:117] "RemoveContainer" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.749167 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749147 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} err="failed to get container status \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" Apr 22 20:00:29.749207 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749169 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.749431 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749413 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} err="failed to get container status \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" Apr 22 20:00:29.749431 
ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749431 2580 scope.go:117] "RemoveContainer" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.749643 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749627 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} err="failed to get container status \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" Apr 22 20:00:29.749643 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749642 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.749846 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749824 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} err="failed to get container status \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" Apr 22 20:00:29.749997 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.749850 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.750120 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750095 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} err="failed to get container status \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": rpc error: code = NotFound desc = could not find container \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" Apr 22 20:00:29.750226 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750121 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.750451 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750426 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} err="failed to get container status \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" Apr 22 20:00:29.750543 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750453 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.750738 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750716 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} err="failed to get container status \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": rpc error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with 
f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" Apr 22 20:00:29.750830 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750739 2580 scope.go:117] "RemoveContainer" containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.751005 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.750986 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} err="failed to get container status \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" Apr 22 20:00:29.751083 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751007 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.751216 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751197 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} err="failed to get container status \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" Apr 22 20:00:29.751394 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751217 2580 scope.go:117] "RemoveContainer" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.751454 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751436 2580 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} err="failed to get container status \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" Apr 22 20:00:29.751503 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751456 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.751703 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751678 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} err="failed to get container status \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" Apr 22 20:00:29.751762 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751704 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.751911 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751891 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} err="failed to get container status \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": rpc error: code = NotFound desc = could not find container 
\"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" Apr 22 20:00:29.751962 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.751912 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.752134 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752115 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} err="failed to get container status \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" Apr 22 20:00:29.752174 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752136 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.752362 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752345 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} err="failed to get container status \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": rpc error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" Apr 22 20:00:29.752420 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752363 2580 scope.go:117] "RemoveContainer" 
containerID="1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822" Apr 22 20:00:29.752591 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752573 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822"} err="failed to get container status \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": rpc error: code = NotFound desc = could not find container \"1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822\": container with ID starting with 1c4e65dde14256b4d79b05056c4f798e1edd4248d679612dd4825fc8638eb822 not found: ID does not exist" Apr 22 20:00:29.752635 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752591 2580 scope.go:117] "RemoveContainer" containerID="f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200" Apr 22 20:00:29.752818 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752796 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200"} err="failed to get container status \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": rpc error: code = NotFound desc = could not find container \"f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200\": container with ID starting with f9be133e7804c6b43422ddbb2f25ee8ebf1417a2660a96e2aa3826831adb2200 not found: ID does not exist" Apr 22 20:00:29.752818 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.752817 2580 scope.go:117] "RemoveContainer" containerID="bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470" Apr 22 20:00:29.753027 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753010 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470"} err="failed to get container status 
\"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": rpc error: code = NotFound desc = could not find container \"bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470\": container with ID starting with bfcc3b090cdcdad1f891c28e89a2fb53b58ba91a11697f9202f6a4404dc4d470 not found: ID does not exist" Apr 22 20:00:29.753091 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753029 2580 scope.go:117] "RemoveContainer" containerID="9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860" Apr 22 20:00:29.753232 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753216 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860"} err="failed to get container status \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": rpc error: code = NotFound desc = could not find container \"9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860\": container with ID starting with 9c244ea1a5ad8dae026fe7149aa8500c988babe089fb0d8948f4b3b8bccef860 not found: ID does not exist" Apr 22 20:00:29.753329 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753233 2580 scope.go:117] "RemoveContainer" containerID="211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333" Apr 22 20:00:29.753447 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753428 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333"} err="failed to get container status \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": rpc error: code = NotFound desc = could not find container \"211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333\": container with ID starting with 211bde93f4e04a4cd49a2b6bee747687a4cd54b0bdb13222021229eb8634d333 not found: ID does not exist" Apr 22 20:00:29.753492 ip-10-0-131-194 
kubenswrapper[2580]: I0422 20:00:29.753448 2580 scope.go:117] "RemoveContainer" containerID="689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1" Apr 22 20:00:29.753652 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753634 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1"} err="failed to get container status \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": rpc error: code = NotFound desc = could not find container \"689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1\": container with ID starting with 689812f7ccdde55290c71c8552551c85437dc1c267eecbff0e9e075a87bd91f1 not found: ID does not exist" Apr 22 20:00:29.753709 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753653 2580 scope.go:117] "RemoveContainer" containerID="f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513" Apr 22 20:00:29.753886 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.753863 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513"} err="failed to get container status \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": rpc error: code = NotFound desc = could not find container \"f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513\": container with ID starting with f538fe904dfb854ddf8ce1905774f52d6ace4b4800f0b39db64ae519a6417513 not found: ID does not exist" Apr 22 20:00:29.754272 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.754237 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.756764 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.756743 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 20:00:29.756856 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.756834 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 20:00:29.757591 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.757574 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 20:00:29.757741 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.757721 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8ivqgt401gq4c\"" Apr 22 20:00:29.757879 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.757865 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 20:00:29.758848 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.758833 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 20:00:29.760782 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.760744 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 20:00:29.760782 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.760765 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 20:00:29.761110 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.760807 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 20:00:29.761110 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.760891 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 20:00:29.761110 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.760749 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mv6l8\"" Apr 22 20:00:29.761110 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.760917 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 20:00:29.763583 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.763558 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 20:00:29.766358 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.766341 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 20:00:29.773335 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.773316 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:29.780828 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.780805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed73bd9e-c47b-450d-89f7-48927170ad3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.780828 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.780835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781085 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781144 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781114 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-config\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781183 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781163 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781225 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781393 
ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781441 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781441 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781421 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781590 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781573 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781648 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781618 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781686 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781645 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781733 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781702 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781733 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781723 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed73bd9e-c47b-450d-89f7-48927170ad3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781838 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781762 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781838 ip-10-0-131-194 kubenswrapper[2580]: 
I0422 20:00:29.781815 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781919 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781850 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.781919 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.781902 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnbtr\" (UniqueName: \"kubernetes.io/projected/ed73bd9e-c47b-450d-89f7-48927170ad3b-kube-api-access-tnbtr\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883207 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883169 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883207 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883209 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-config\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883312 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883338 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883450 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883473 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.883495 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883536 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/ed73bd9e-c47b-450d-89f7-48927170ad3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883603 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883669 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883700 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnbtr\" (UniqueName: \"kubernetes.io/projected/ed73bd9e-c47b-450d-89f7-48927170ad3b-kube-api-access-tnbtr\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed73bd9e-c47b-450d-89f7-48927170ad3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.883783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.884893 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.884758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.885694 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.885381 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.886587 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.886563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-config\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887048 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887012 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887048 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887020 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887203 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887046 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887203 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887077 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887203 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887084 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
20:00:29.887709 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887688 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887791 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887720 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.887868 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.887845 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ed73bd9e-c47b-450d-89f7-48927170ad3b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.888076 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.888057 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.888186 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.888172 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.889614 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.889588 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ed73bd9e-c47b-450d-89f7-48927170ad3b-config-out\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.889807 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.889791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ed73bd9e-c47b-450d-89f7-48927170ad3b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.890014 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.889999 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-web-config\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.890141 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.890127 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ed73bd9e-c47b-450d-89f7-48927170ad3b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:29.895498 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:29.895477 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnbtr\" (UniqueName: \"kubernetes.io/projected/ed73bd9e-c47b-450d-89f7-48927170ad3b-kube-api-access-tnbtr\") pod \"prometheus-k8s-0\" (UID: \"ed73bd9e-c47b-450d-89f7-48927170ad3b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
20:00:30.065499 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:30.065393 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:30.198479 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:30.198436 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:30.200848 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:00:30.200816 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded73bd9e_c47b_450d_89f7_48927170ad3b.slice/crio-8276340f16974ca7229e3edbda24cb85a8da0fcb73d86e093199f7313d948f86 WatchSource:0}: Error finding container 8276340f16974ca7229e3edbda24cb85a8da0fcb73d86e093199f7313d948f86: Status 404 returned error can't find the container with id 8276340f16974ca7229e3edbda24cb85a8da0fcb73d86e093199f7313d948f86 Apr 22 20:00:30.695770 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:30.695732 2580 generic.go:358] "Generic (PLEG): container finished" podID="ed73bd9e-c47b-450d-89f7-48927170ad3b" containerID="172e68c531901469c73a628ddf16df8feaab9c39fd29b050ca7b45e6509e6d35" exitCode=0 Apr 22 20:00:30.696207 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:30.695783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerDied","Data":"172e68c531901469c73a628ddf16df8feaab9c39fd29b050ca7b45e6509e6d35"} Apr 22 20:00:30.696207 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:30.695806 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"8276340f16974ca7229e3edbda24cb85a8da0fcb73d86e093199f7313d948f86"} Apr 22 20:00:31.069524 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.069496 2580 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="cd6b8d80-91b9-4b46-8aba-4bc5da59a846" path="/var/lib/kubelet/pods/cd6b8d80-91b9-4b46-8aba-4bc5da59a846/volumes" Apr 22 20:00:31.703887 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.703854 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"df1a053e2d53f7d0b9c7048d7464a370eda0d9940a9dee5b2cccc70fd8a412bd"} Apr 22 20:00:31.703887 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.703890 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"d2d39b6d3f416c8c3fbc9a0fce56cbb94420452c57680fb6f429ed9b4398999a"} Apr 22 20:00:31.704397 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.703908 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"5302f10f6078293fdb6b1eac67d29f57c1e03e50ceeb46a22fbd50aea3ff2457"} Apr 22 20:00:31.704397 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.703919 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"06dcd2da470c22b5fea24b315826e1d6a3eebc7806da4249cceb4daab9cd084a"} Apr 22 20:00:31.704397 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.703929 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"e38c686c48aa80b19a78e37fa018814ea07dec708886e3f35253a96af6703962"} Apr 22 20:00:31.704397 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.703940 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ed73bd9e-c47b-450d-89f7-48927170ad3b","Type":"ContainerStarted","Data":"cffdf87d39bb1f9021325ba6daa42863f3fc3d7e03352cf42fb066d3e284f946"} Apr 22 20:00:31.731958 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:31.731897 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.731877789 podStartE2EDuration="2.731877789s" podCreationTimestamp="2026-04-22 20:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:31.730477963 +0000 UTC m=+153.249370997" watchObservedRunningTime="2026-04-22 20:00:31.731877789 +0000 UTC m=+153.250770826" Apr 22 20:00:35.067634 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:35.067605 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:55.823538 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.823451 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-48pmd"] Apr 22 20:00:55.826751 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.826728 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:55.829166 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.829140 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 20:00:55.833735 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.833711 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-48pmd"] Apr 22 20:00:55.914441 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.914399 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dada5cb0-848a-489c-9661-b42e69f239a3-original-pull-secret\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:55.914640 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.914462 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dada5cb0-848a-489c-9661-b42e69f239a3-kubelet-config\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:55.914640 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:55.914556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dada5cb0-848a-489c-9661-b42e69f239a3-dbus\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.015369 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.015333 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/dada5cb0-848a-489c-9661-b42e69f239a3-kubelet-config\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.015561 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.015400 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dada5cb0-848a-489c-9661-b42e69f239a3-dbus\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.015561 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.015446 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dada5cb0-848a-489c-9661-b42e69f239a3-original-pull-secret\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.015561 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.015472 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dada5cb0-848a-489c-9661-b42e69f239a3-kubelet-config\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.015674 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.015601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dada5cb0-848a-489c-9661-b42e69f239a3-dbus\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.017956 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.017940 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dada5cb0-848a-489c-9661-b42e69f239a3-original-pull-secret\") pod \"global-pull-secret-syncer-48pmd\" (UID: \"dada5cb0-848a-489c-9661-b42e69f239a3\") " pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.137797 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.137714 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-48pmd" Apr 22 20:00:56.256551 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.256413 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-48pmd"] Apr 22 20:00:56.258767 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:00:56.258736 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddada5cb0_848a_489c_9661_b42e69f239a3.slice/crio-63c8a144c241c79098195ce5bd9afe247ebc5c14abdfb3f4d3dd04ceccf41466 WatchSource:0}: Error finding container 63c8a144c241c79098195ce5bd9afe247ebc5c14abdfb3f4d3dd04ceccf41466: Status 404 returned error can't find the container with id 63c8a144c241c79098195ce5bd9afe247ebc5c14abdfb3f4d3dd04ceccf41466 Apr 22 20:00:56.782476 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:00:56.782434 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-48pmd" event={"ID":"dada5cb0-848a-489c-9661-b42e69f239a3","Type":"ContainerStarted","Data":"63c8a144c241c79098195ce5bd9afe247ebc5c14abdfb3f4d3dd04ceccf41466"} Apr 22 20:01:00.797146 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:01:00.797109 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-48pmd" event={"ID":"dada5cb0-848a-489c-9661-b42e69f239a3","Type":"ContainerStarted","Data":"4ef892bd67ac91e6bf7c317a72238c6a868a18c58c38cab33589011ac053de4e"} Apr 22 20:01:00.813846 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:01:00.813800 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-48pmd" podStartSLOduration=2.311430721 podStartE2EDuration="5.813786875s" podCreationTimestamp="2026-04-22 20:00:55 +0000 UTC" firstStartedPulling="2026-04-22 20:00:56.260417932 +0000 UTC m=+177.779310960" lastFinishedPulling="2026-04-22 20:00:59.762774093 +0000 UTC m=+181.281667114" observedRunningTime="2026-04-22 20:01:00.811945435 +0000 UTC m=+182.330838484" watchObservedRunningTime="2026-04-22 20:01:00.813786875 +0000 UTC m=+182.332679909" Apr 22 20:01:30.065817 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:01:30.065778 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:30.082036 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:01:30.082007 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:30.896995 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:01:30.896961 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:02:58.443485 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.443450 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-665c47d676-tcchb"] Apr 22 20:02:58.446155 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.446136 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.448640 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.448624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 20:02:58.448721 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.448659 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:02:58.448721 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.448660 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-gz55x\"" Apr 22 20:02:58.448833 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.448738 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:02:58.453973 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.453949 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7"] Apr 22 20:02:58.456917 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.456896 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.457546 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.457529 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-tcchb"] Apr 22 20:02:58.459033 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.459014 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-5q2qn\"" Apr 22 20:02:58.459198 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.459183 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 20:02:58.466824 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.466805 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7"] Apr 22 20:02:58.474714 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.474694 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnrb\" (UniqueName: \"kubernetes.io/projected/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-kube-api-access-rnnrb\") pod \"kserve-controller-manager-665c47d676-tcchb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.474817 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.474725 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ccfb02d-f067-4364-81ed-47a37cbd739c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7qvh7\" (UID: \"0ccfb02d-f067-4364-81ed-47a37cbd739c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.474817 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.474755 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92rg5\" 
(UniqueName: \"kubernetes.io/projected/0ccfb02d-f067-4364-81ed-47a37cbd739c-kube-api-access-92rg5\") pod \"llmisvc-controller-manager-68cc5db7c4-7qvh7\" (UID: \"0ccfb02d-f067-4364-81ed-47a37cbd739c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.474905 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.474818 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-cert\") pod \"kserve-controller-manager-665c47d676-tcchb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.575992 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.575962 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ccfb02d-f067-4364-81ed-47a37cbd739c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7qvh7\" (UID: \"0ccfb02d-f067-4364-81ed-47a37cbd739c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.576197 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.576008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92rg5\" (UniqueName: \"kubernetes.io/projected/0ccfb02d-f067-4364-81ed-47a37cbd739c-kube-api-access-92rg5\") pod \"llmisvc-controller-manager-68cc5db7c4-7qvh7\" (UID: \"0ccfb02d-f067-4364-81ed-47a37cbd739c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.576197 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.576054 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-cert\") pod \"kserve-controller-manager-665c47d676-tcchb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.576197 ip-10-0-131-194 
kubenswrapper[2580]: I0422 20:02:58.576102 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnrb\" (UniqueName: \"kubernetes.io/projected/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-kube-api-access-rnnrb\") pod \"kserve-controller-manager-665c47d676-tcchb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.578629 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.578603 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-cert\") pod \"kserve-controller-manager-665c47d676-tcchb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.578984 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.578967 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ccfb02d-f067-4364-81ed-47a37cbd739c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-7qvh7\" (UID: \"0ccfb02d-f067-4364-81ed-47a37cbd739c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.585473 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.585450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnrb\" (UniqueName: \"kubernetes.io/projected/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-kube-api-access-rnnrb\") pod \"kserve-controller-manager-665c47d676-tcchb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.585568 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.585529 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92rg5\" (UniqueName: \"kubernetes.io/projected/0ccfb02d-f067-4364-81ed-47a37cbd739c-kube-api-access-92rg5\") pod \"llmisvc-controller-manager-68cc5db7c4-7qvh7\" (UID: 
\"0ccfb02d-f067-4364-81ed-47a37cbd739c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.757591 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.757484 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:02:58.767386 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.767358 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:02:58.895564 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.895526 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-tcchb"] Apr 22 20:02:58.899937 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:02:58.899909 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dc50a8c_02f0_423f_a2ad_bfcf14bef7bb.slice/crio-89f7fb00802a6db5329648af13c24a438c4df8eeb3bfeaec4c42652c830a43b6 WatchSource:0}: Error finding container 89f7fb00802a6db5329648af13c24a438c4df8eeb3bfeaec4c42652c830a43b6: Status 404 returned error can't find the container with id 89f7fb00802a6db5329648af13c24a438c4df8eeb3bfeaec4c42652c830a43b6 Apr 22 20:02:58.918297 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.918194 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7"] Apr 22 20:02:58.921152 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:02:58.921123 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0ccfb02d_f067_4364_81ed_47a37cbd739c.slice/crio-1d934f71151383fc9afc83437c154b1c70ccb80fe6e2753365a06048972de0b4 WatchSource:0}: Error finding container 1d934f71151383fc9afc83437c154b1c70ccb80fe6e2753365a06048972de0b4: Status 404 returned error can't find the container with id 1d934f71151383fc9afc83437c154b1c70ccb80fe6e2753365a06048972de0b4 
Apr 22 20:02:58.945298 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:58.945247 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 20:02:59.139022 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:59.138843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-tcchb" event={"ID":"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb","Type":"ContainerStarted","Data":"89f7fb00802a6db5329648af13c24a438c4df8eeb3bfeaec4c42652c830a43b6"} Apr 22 20:02:59.148740 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:02:59.139921 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" event={"ID":"0ccfb02d-f067-4364-81ed-47a37cbd739c","Type":"ContainerStarted","Data":"1d934f71151383fc9afc83437c154b1c70ccb80fe6e2753365a06048972de0b4"} Apr 22 20:03:02.152999 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:02.152964 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" event={"ID":"0ccfb02d-f067-4364-81ed-47a37cbd739c","Type":"ContainerStarted","Data":"a59fbd4ab07543de7feeb265501931a2d29acfbf6769fd8c983013bc4cf7d26c"} Apr 22 20:03:02.153474 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:02.153096 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:03:02.191437 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:02.191373 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" podStartSLOduration=1.523027956 podStartE2EDuration="4.191356213s" podCreationTimestamp="2026-04-22 20:02:58 +0000 UTC" firstStartedPulling="2026-04-22 20:02:58.922630913 +0000 UTC m=+300.441523940" lastFinishedPulling="2026-04-22 20:03:01.590959178 +0000 UTC m=+303.109852197" observedRunningTime="2026-04-22 20:03:02.190000675 +0000 UTC m=+303.708893711" 
watchObservedRunningTime="2026-04-22 20:03:02.191356213 +0000 UTC m=+303.710249251" Apr 22 20:03:03.157515 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:03.157477 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-tcchb" event={"ID":"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb","Type":"ContainerStarted","Data":"a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23"} Apr 22 20:03:03.157987 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:03.157534 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:03:03.175741 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:03.175687 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-665c47d676-tcchb" podStartSLOduration=1.795253927 podStartE2EDuration="5.175672948s" podCreationTimestamp="2026-04-22 20:02:58 +0000 UTC" firstStartedPulling="2026-04-22 20:02:58.901153789 +0000 UTC m=+300.420046802" lastFinishedPulling="2026-04-22 20:03:02.28157281 +0000 UTC m=+303.800465823" observedRunningTime="2026-04-22 20:03:03.174174329 +0000 UTC m=+304.693067363" watchObservedRunningTime="2026-04-22 20:03:03.175672948 +0000 UTC m=+304.694565981" Apr 22 20:03:33.160360 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:33.160321 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-7qvh7" Apr 22 20:03:34.166181 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.166154 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:03:34.304687 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.304652 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-tcchb"] Apr 22 20:03:34.304918 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.304893 
2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-665c47d676-tcchb" podUID="2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" containerName="manager" containerID="cri-o://a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23" gracePeriod=10 Apr 22 20:03:34.542816 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.542794 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:03:34.688920 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.688892 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-cert\") pod \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " Apr 22 20:03:34.689075 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.689009 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnnrb\" (UniqueName: \"kubernetes.io/projected/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-kube-api-access-rnnrb\") pod \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\" (UID: \"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb\") " Apr 22 20:03:34.691339 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.691303 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-cert" (OuterVolumeSpecName: "cert") pod "2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" (UID: "2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:03:34.691443 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.691307 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-kube-api-access-rnnrb" (OuterVolumeSpecName: "kube-api-access-rnnrb") pod "2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" (UID: "2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb"). InnerVolumeSpecName "kube-api-access-rnnrb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:03:34.790102 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.790067 2580 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-cert\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:03:34.790102 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:34.790096 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rnnrb\" (UniqueName: \"kubernetes.io/projected/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb-kube-api-access-rnnrb\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\"" Apr 22 20:03:35.250373 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.250341 2580 generic.go:358] "Generic (PLEG): container finished" podID="2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" containerID="a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23" exitCode=0 Apr 22 20:03:35.250837 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.250399 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-tcchb" event={"ID":"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb","Type":"ContainerDied","Data":"a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23"} Apr 22 20:03:35.250837 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.250408 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-665c47d676-tcchb" Apr 22 20:03:35.250837 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.250427 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-665c47d676-tcchb" event={"ID":"2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb","Type":"ContainerDied","Data":"89f7fb00802a6db5329648af13c24a438c4df8eeb3bfeaec4c42652c830a43b6"} Apr 22 20:03:35.250837 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.250442 2580 scope.go:117] "RemoveContainer" containerID="a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23" Apr 22 20:03:35.258450 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.258437 2580 scope.go:117] "RemoveContainer" containerID="a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23" Apr 22 20:03:35.258709 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:03:35.258691 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23\": container with ID starting with a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23 not found: ID does not exist" containerID="a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23" Apr 22 20:03:35.258772 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.258720 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23"} err="failed to get container status \"a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23\": rpc error: code = NotFound desc = could not find container \"a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23\": container with ID starting with a012cacf74503b6a7cd974482025254ba7ce801931a6f436000d38dee99d7f23 not found: ID does not exist" Apr 22 20:03:35.266578 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.266556 
2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-tcchb"] Apr 22 20:03:35.269072 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:35.269053 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-665c47d676-tcchb"] Apr 22 20:03:37.068379 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:03:37.068348 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" path="/var/lib/kubelet/pods/2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb/volumes" Apr 22 20:04:26.222348 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.222311 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-4mc26"] Apr 22 20:04:26.222809 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.222652 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" containerName="manager" Apr 22 20:04:26.222809 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.222664 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" containerName="manager" Apr 22 20:04:26.222809 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.222734 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dc50a8c-02f0-423f-a2ad-bfcf14bef7bb" containerName="manager" Apr 22 20:04:26.224656 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.224637 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-4mc26" Apr 22 20:04:26.227268 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.227228 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-6xk8w\"" Apr 22 20:04:26.227479 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.227463 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 20:04:26.233430 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.233408 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-4mc26"] Apr 22 20:04:26.244147 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.244120 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75b84\" (UniqueName: \"kubernetes.io/projected/f29c876b-30f2-407e-8f4f-86cdab68ee88-kube-api-access-75b84\") pod \"s3-init-4mc26\" (UID: \"f29c876b-30f2-407e-8f4f-86cdab68ee88\") " pod="kserve/s3-init-4mc26" Apr 22 20:04:26.345345 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.345311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75b84\" (UniqueName: \"kubernetes.io/projected/f29c876b-30f2-407e-8f4f-86cdab68ee88-kube-api-access-75b84\") pod \"s3-init-4mc26\" (UID: \"f29c876b-30f2-407e-8f4f-86cdab68ee88\") " pod="kserve/s3-init-4mc26" Apr 22 20:04:26.362444 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.362407 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75b84\" (UniqueName: \"kubernetes.io/projected/f29c876b-30f2-407e-8f4f-86cdab68ee88-kube-api-access-75b84\") pod \"s3-init-4mc26\" (UID: \"f29c876b-30f2-407e-8f4f-86cdab68ee88\") " pod="kserve/s3-init-4mc26" Apr 22 20:04:26.545624 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.545529 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-4mc26" Apr 22 20:04:26.670888 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.670857 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-4mc26"] Apr 22 20:04:26.675114 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:04:26.675089 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf29c876b_30f2_407e_8f4f_86cdab68ee88.slice/crio-a2389376fce5ba6c2c582bd57c6c782d8887d1fd4ba38c7d1d99b0190bd08caf WatchSource:0}: Error finding container a2389376fce5ba6c2c582bd57c6c782d8887d1fd4ba38c7d1d99b0190bd08caf: Status 404 returned error can't find the container with id a2389376fce5ba6c2c582bd57c6c782d8887d1fd4ba38c7d1d99b0190bd08caf Apr 22 20:04:26.676966 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:26.676947 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:04:27.410070 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:27.410028 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4mc26" event={"ID":"f29c876b-30f2-407e-8f4f-86cdab68ee88","Type":"ContainerStarted","Data":"a2389376fce5ba6c2c582bd57c6c782d8887d1fd4ba38c7d1d99b0190bd08caf"} Apr 22 20:04:32.429104 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:32.429068 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4mc26" event={"ID":"f29c876b-30f2-407e-8f4f-86cdab68ee88","Type":"ContainerStarted","Data":"0012391092f21a26191f9fa71082dc43fdccd62a86196b5d3943782bbb705ba5"} Apr 22 20:04:32.462031 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:32.461970 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-4mc26" podStartSLOduration=1.236985046 podStartE2EDuration="6.461949838s" podCreationTimestamp="2026-04-22 20:04:26 +0000 UTC" firstStartedPulling="2026-04-22 20:04:26.677121096 +0000 UTC m=+388.196014112" 
lastFinishedPulling="2026-04-22 20:04:31.902085887 +0000 UTC m=+393.420978904" observedRunningTime="2026-04-22 20:04:32.459773066 +0000 UTC m=+393.978666100" watchObservedRunningTime="2026-04-22 20:04:32.461949838 +0000 UTC m=+393.980842874"
Apr 22 20:04:35.440078 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:35.440046    2580 generic.go:358] "Generic (PLEG): container finished" podID="f29c876b-30f2-407e-8f4f-86cdab68ee88" containerID="0012391092f21a26191f9fa71082dc43fdccd62a86196b5d3943782bbb705ba5" exitCode=0
Apr 22 20:04:35.440504 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:35.440127    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4mc26" event={"ID":"f29c876b-30f2-407e-8f4f-86cdab68ee88","Type":"ContainerDied","Data":"0012391092f21a26191f9fa71082dc43fdccd62a86196b5d3943782bbb705ba5"}
Apr 22 20:04:36.571979 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:36.571953    2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-4mc26"
Apr 22 20:04:36.642864 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:36.642828    2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75b84\" (UniqueName: \"kubernetes.io/projected/f29c876b-30f2-407e-8f4f-86cdab68ee88-kube-api-access-75b84\") pod \"f29c876b-30f2-407e-8f4f-86cdab68ee88\" (UID: \"f29c876b-30f2-407e-8f4f-86cdab68ee88\") "
Apr 22 20:04:36.645138 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:36.645105    2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29c876b-30f2-407e-8f4f-86cdab68ee88-kube-api-access-75b84" (OuterVolumeSpecName: "kube-api-access-75b84") pod "f29c876b-30f2-407e-8f4f-86cdab68ee88" (UID: "f29c876b-30f2-407e-8f4f-86cdab68ee88"). InnerVolumeSpecName "kube-api-access-75b84". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:04:36.743802 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:36.743704    2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-75b84\" (UniqueName: \"kubernetes.io/projected/f29c876b-30f2-407e-8f4f-86cdab68ee88-kube-api-access-75b84\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\""
Apr 22 20:04:37.446240 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:37.446210    2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-4mc26"
Apr 22 20:04:37.446240 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:37.446217    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-4mc26" event={"ID":"f29c876b-30f2-407e-8f4f-86cdab68ee88","Type":"ContainerDied","Data":"a2389376fce5ba6c2c582bd57c6c782d8887d1fd4ba38c7d1d99b0190bd08caf"}
Apr 22 20:04:37.446455 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:04:37.446267    2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2389376fce5ba6c2c582bd57c6c782d8887d1fd4ba38c7d1d99b0190bd08caf"
Apr 22 20:17:10.581733 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.581693    2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kdhhw/must-gather-wfmfg"]
Apr 22 20:17:10.582280 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.582125    2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f29c876b-30f2-407e-8f4f-86cdab68ee88" containerName="s3-init"
Apr 22 20:17:10.582280 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.582140    2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29c876b-30f2-407e-8f4f-86cdab68ee88" containerName="s3-init"
Apr 22 20:17:10.582280 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.582200    2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="f29c876b-30f2-407e-8f4f-86cdab68ee88" containerName="s3-init"
Apr 22 20:17:10.585351 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.585332    2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.587791 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.587768    2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kdhhw\"/\"kube-root-ca.crt\""
Apr 22 20:17:10.588759 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.588741    2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kdhhw\"/\"openshift-service-ca.crt\""
Apr 22 20:17:10.588857 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.588782    2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kdhhw\"/\"default-dockercfg-znl4d\""
Apr 22 20:17:10.591402 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.591382    2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kdhhw/must-gather-wfmfg"]
Apr 22 20:17:10.777406 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.777362    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-must-gather-output\") pod \"must-gather-wfmfg\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") " pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.777589 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.777495    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxvp\" (UniqueName: \"kubernetes.io/projected/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-kube-api-access-gxxvp\") pod \"must-gather-wfmfg\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") " pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.878290 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.878165    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxvp\" (UniqueName: \"kubernetes.io/projected/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-kube-api-access-gxxvp\") pod \"must-gather-wfmfg\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") " pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.878290 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.878231    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-must-gather-output\") pod \"must-gather-wfmfg\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") " pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.878569 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.878551    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-must-gather-output\") pod \"must-gather-wfmfg\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") " pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.890634 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.890597    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxvp\" (UniqueName: \"kubernetes.io/projected/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-kube-api-access-gxxvp\") pod \"must-gather-wfmfg\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") " pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:10.906754 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:10.906730    2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:11.031960 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:11.031931    2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kdhhw/must-gather-wfmfg"]
Apr 22 20:17:11.034378 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:17:11.034346    2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9994eae9_d6c4_4d6e_a478_3c5b1658dd9b.slice/crio-513d23b66b94f9ccc4903d25da6351075bf6452dab806e57a338ced3a131ac39 WatchSource:0}: Error finding container 513d23b66b94f9ccc4903d25da6351075bf6452dab806e57a338ced3a131ac39: Status 404 returned error can't find the container with id 513d23b66b94f9ccc4903d25da6351075bf6452dab806e57a338ced3a131ac39
Apr 22 20:17:11.036168 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:11.036150    2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:17:11.709398 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:11.709358    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" event={"ID":"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b","Type":"ContainerStarted","Data":"513d23b66b94f9ccc4903d25da6351075bf6452dab806e57a338ced3a131ac39"}
Apr 22 20:17:16.727816 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:16.727781    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" event={"ID":"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b","Type":"ContainerStarted","Data":"42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c"}
Apr 22 20:17:16.727816 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:16.727819    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" event={"ID":"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b","Type":"ContainerStarted","Data":"8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"}
Apr 22 20:17:16.743075 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:16.743008    2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" podStartSLOduration=1.649094214 podStartE2EDuration="6.742987681s" podCreationTimestamp="2026-04-22 20:17:10 +0000 UTC" firstStartedPulling="2026-04-22 20:17:11.036301244 +0000 UTC m=+1152.555194257" lastFinishedPulling="2026-04-22 20:17:16.130194698 +0000 UTC m=+1157.649087724" observedRunningTime="2026-04-22 20:17:16.741777494 +0000 UTC m=+1158.260670533" watchObservedRunningTime="2026-04-22 20:17:16.742987681 +0000 UTC m=+1158.261880717"
Apr 22 20:17:33.789632 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:33.789553    2580 generic.go:358] "Generic (PLEG): container finished" podID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerID="8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad" exitCode=0
Apr 22 20:17:33.789632 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:33.789602    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" event={"ID":"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b","Type":"ContainerDied","Data":"8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"}
Apr 22 20:17:33.790049 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:33.789934    2580 scope.go:117] "RemoveContainer" containerID="8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"
Apr 22 20:17:34.521714 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:34.521689    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdhhw_must-gather-wfmfg_9994eae9-d6c4-4d6e-a478-3c5b1658dd9b/gather/0.log"
Apr 22 20:17:35.036078 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.036043    2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv695/must-gather-4mqnc"]
Apr 22 20:17:35.038607 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.038586    2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.041176 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.041154    2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mv695\"/\"openshift-service-ca.crt\""
Apr 22 20:17:35.041311 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.041154    2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mv695\"/\"kube-root-ca.crt\""
Apr 22 20:17:35.042095 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.042071    2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mv695\"/\"default-dockercfg-kkmqp\""
Apr 22 20:17:35.047339 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.047314    2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/must-gather-4mqnc"]
Apr 22 20:17:35.109078 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.109049    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdz4r\" (UniqueName: \"kubernetes.io/projected/86de52d2-3df1-4403-b72a-39f963660abb-kube-api-access-vdz4r\") pod \"must-gather-4mqnc\" (UID: \"86de52d2-3df1-4403-b72a-39f963660abb\") " pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.109245 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.109083    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86de52d2-3df1-4403-b72a-39f963660abb-must-gather-output\") pod \"must-gather-4mqnc\" (UID: \"86de52d2-3df1-4403-b72a-39f963660abb\") " pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.210458 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.210423    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdz4r\" (UniqueName: \"kubernetes.io/projected/86de52d2-3df1-4403-b72a-39f963660abb-kube-api-access-vdz4r\") pod \"must-gather-4mqnc\" (UID: \"86de52d2-3df1-4403-b72a-39f963660abb\") " pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.210458 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.210461    2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86de52d2-3df1-4403-b72a-39f963660abb-must-gather-output\") pod \"must-gather-4mqnc\" (UID: \"86de52d2-3df1-4403-b72a-39f963660abb\") " pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.210861 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.210837    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/86de52d2-3df1-4403-b72a-39f963660abb-must-gather-output\") pod \"must-gather-4mqnc\" (UID: \"86de52d2-3df1-4403-b72a-39f963660abb\") " pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.218519 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.218495    2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdz4r\" (UniqueName: \"kubernetes.io/projected/86de52d2-3df1-4403-b72a-39f963660abb-kube-api-access-vdz4r\") pod \"must-gather-4mqnc\" (UID: \"86de52d2-3df1-4403-b72a-39f963660abb\") " pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.348563 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.348483    2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv695/must-gather-4mqnc"
Apr 22 20:17:35.470020 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.469998    2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/must-gather-4mqnc"]
Apr 22 20:17:35.472639 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:17:35.472610    2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86de52d2_3df1_4403_b72a_39f963660abb.slice/crio-21b39931e10bfa93b800f04ca13b681ffca18c6da10bd967d20e8705f72caa0e WatchSource:0}: Error finding container 21b39931e10bfa93b800f04ca13b681ffca18c6da10bd967d20e8705f72caa0e: Status 404 returned error can't find the container with id 21b39931e10bfa93b800f04ca13b681ffca18c6da10bd967d20e8705f72caa0e
Apr 22 20:17:35.797003 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:35.796970    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/must-gather-4mqnc" event={"ID":"86de52d2-3df1-4403-b72a-39f963660abb","Type":"ContainerStarted","Data":"21b39931e10bfa93b800f04ca13b681ffca18c6da10bd967d20e8705f72caa0e"}
Apr 22 20:17:36.803114 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:36.803020    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/must-gather-4mqnc" event={"ID":"86de52d2-3df1-4403-b72a-39f963660abb","Type":"ContainerStarted","Data":"02c5fd08379115c58d774f5cafeb25d26b1bcd6afecae99524daf76658831202"}
Apr 22 20:17:36.803114 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:36.803075    2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/must-gather-4mqnc" event={"ID":"86de52d2-3df1-4403-b72a-39f963660abb","Type":"ContainerStarted","Data":"60c821b5064a8e6f120b37309d5109db89e281c9ec48b44a682dbe85f04af243"}
Apr 22 20:17:36.819389 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:36.819330    2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv695/must-gather-4mqnc" podStartSLOduration=1.04272268 podStartE2EDuration="1.819307609s" podCreationTimestamp="2026-04-22 20:17:35 +0000 UTC" firstStartedPulling="2026-04-22 20:17:35.474618574 +0000 UTC m=+1176.993511588" lastFinishedPulling="2026-04-22 20:17:36.251203502 +0000 UTC m=+1177.770096517" observedRunningTime="2026-04-22 20:17:36.81765672 +0000 UTC m=+1178.336549762" watchObservedRunningTime="2026-04-22 20:17:36.819307609 +0000 UTC m=+1178.338200644"
Apr 22 20:17:37.619213 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:37.619188    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-48pmd_dada5cb0-848a-489c-9661-b42e69f239a3/global-pull-secret-syncer/0.log"
Apr 22 20:17:37.925069 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:37.924999    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q4kgc_f4ecbe0e-65f8-404d-a158-29f98b2705f1/konnectivity-agent/0.log"
Apr 22 20:17:38.044111 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:38.044085    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-194.ec2.internal_d94f1ae0823126e44c89a34cb4f19534/haproxy/0.log"
Apr 22 20:17:39.880906 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:39.880851    2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kdhhw/must-gather-wfmfg"]
Apr 22 20:17:39.881540 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:39.881500    2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="copy" containerID="cri-o://42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c" gracePeriod=2
Apr 22 20:17:39.884232 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:39.884182    2580 status_manager.go:895] "Failed to get status for pod" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" err="pods \"must-gather-wfmfg\" is forbidden: User \"system:node:ip-10-0-131-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kdhhw\": no relationship found between node 'ip-10-0-131-194.ec2.internal' and this object"
Apr 22 20:17:39.886165 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:39.886140    2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kdhhw/must-gather-wfmfg"]
Apr 22 20:17:40.282239 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.281813    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdhhw_must-gather-wfmfg_9994eae9-d6c4-4d6e-a478-3c5b1658dd9b/copy/0.log"
Apr 22 20:17:40.282446 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.282413    2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:40.284601 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.284550    2580 status_manager.go:895] "Failed to get status for pod" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" err="pods \"must-gather-wfmfg\" is forbidden: User \"system:node:ip-10-0-131-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kdhhw\": no relationship found between node 'ip-10-0-131-194.ec2.internal' and this object"
Apr 22 20:17:40.371134 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.369416    2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxxvp\" (UniqueName: \"kubernetes.io/projected/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-kube-api-access-gxxvp\") pod \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") "
Apr 22 20:17:40.371134 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.369473    2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-must-gather-output\") pod \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\" (UID: \"9994eae9-d6c4-4d6e-a478-3c5b1658dd9b\") "
Apr 22 20:17:40.371134 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.370820    2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" (UID: "9994eae9-d6c4-4d6e-a478-3c5b1658dd9b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:17:40.377734 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.377685    2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-kube-api-access-gxxvp" (OuterVolumeSpecName: "kube-api-access-gxxvp") pod "9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" (UID: "9994eae9-d6c4-4d6e-a478-3c5b1658dd9b"). InnerVolumeSpecName "kube-api-access-gxxvp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:17:40.470646 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.470563    2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxxvp\" (UniqueName: \"kubernetes.io/projected/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-kube-api-access-gxxvp\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\""
Apr 22 20:17:40.470646 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.470603    2580 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b-must-gather-output\") on node \"ip-10-0-131-194.ec2.internal\" DevicePath \"\""
Apr 22 20:17:40.823658 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.823515    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kdhhw_must-gather-wfmfg_9994eae9-d6c4-4d6e-a478-3c5b1658dd9b/copy/0.log"
Apr 22 20:17:40.827283 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.824334    2580 generic.go:358] "Generic (PLEG): container finished" podID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerID="42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c" exitCode=143
Apr 22 20:17:40.827283 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.824439    2580 scope.go:117] "RemoveContainer" containerID="42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c"
Apr 22 20:17:40.827283 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.824579    2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kdhhw/must-gather-wfmfg"
Apr 22 20:17:40.832327 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.832287    2580 status_manager.go:895] "Failed to get status for pod" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" err="pods \"must-gather-wfmfg\" is forbidden: User \"system:node:ip-10-0-131-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kdhhw\": no relationship found between node 'ip-10-0-131-194.ec2.internal' and this object"
Apr 22 20:17:40.848575 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.848527    2580 status_manager.go:895] "Failed to get status for pod" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" pod="openshift-must-gather-kdhhw/must-gather-wfmfg" err="pods \"must-gather-wfmfg\" is forbidden: User \"system:node:ip-10-0-131-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kdhhw\": no relationship found between node 'ip-10-0-131-194.ec2.internal' and this object"
Apr 22 20:17:40.849083 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.849061    2580 scope.go:117] "RemoveContainer" containerID="8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"
Apr 22 20:17:40.876347 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.874324    2580 scope.go:117] "RemoveContainer" containerID="42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c"
Apr 22 20:17:40.876347 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:17:40.874691    2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c\": container with ID starting with 42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c not found: ID does not exist" containerID="42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c"
Apr 22 20:17:40.876347 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.874723    2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c"} err="failed to get container status \"42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c\": rpc error: code = NotFound desc = could not find container \"42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c\": container with ID starting with 42e5187e791f7c0770896c4e77aea0dde74a02e11de1929ed3ae75ac0297c98c not found: ID does not exist"
Apr 22 20:17:40.876347 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.874751    2580 scope.go:117] "RemoveContainer" containerID="8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"
Apr 22 20:17:40.876347 ip-10-0-131-194 kubenswrapper[2580]: E0422 20:17:40.875050    2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad\": container with ID starting with 8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad not found: ID does not exist" containerID="8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"
Apr 22 20:17:40.876347 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:40.875073    2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad"} err="failed to get container status \"8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad\": rpc error: code = NotFound desc = could not find container \"8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad\": container with ID starting with 8a7aabe827d9947250837ee408f139f4b9c6dbea5768cb73c922efa8f5b143ad not found: ID does not exist"
Apr 22 20:17:41.070532 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.070492    2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" path="/var/lib/kubelet/pods/9994eae9-d6c4-4d6e-a478-3c5b1658dd9b/volumes"
Apr 22 20:17:41.369047 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.369015    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/alertmanager/0.log"
Apr 22 20:17:41.394166 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.394143    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/config-reloader/0.log"
Apr 22 20:17:41.419330 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.419301    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/kube-rbac-proxy-web/0.log"
Apr 22 20:17:41.443597 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.443482    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/kube-rbac-proxy/0.log"
Apr 22 20:17:41.468748 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.468716    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/kube-rbac-proxy-metric/0.log"
Apr 22 20:17:41.492790 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.492751    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/prom-label-proxy/0.log"
Apr 22 20:17:41.521051 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.521021    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ac443213-2dbe-46bd-8c27-4653e0c88871/init-config-reloader/0.log"
Apr 22 20:17:41.581101 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.581064    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-56gsc_1df16191-fb27-4b2c-b54d-efc9ceebda35/kube-state-metrics/0.log"
Apr 22 20:17:41.602592 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.602553    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-56gsc_1df16191-fb27-4b2c-b54d-efc9ceebda35/kube-rbac-proxy-main/0.log"
Apr 22 20:17:41.626088 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.626009    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-56gsc_1df16191-fb27-4b2c-b54d-efc9ceebda35/kube-rbac-proxy-self/0.log"
Apr 22 20:17:41.661188 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.661158    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5548c4d88b-9bwd8_cc2f49c1-8615-4b4d-af75-4ba188e460f0/metrics-server/0.log"
Apr 22 20:17:41.686662 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.686628    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rt8sg_3aae5d5c-3e3d-41f7-8936-de19245f2588/monitoring-plugin/0.log"
Apr 22 20:17:41.787274 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.787226    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8cpr_3512a00a-a9e9-4a62-b382-f679dbdd1b67/node-exporter/0.log"
Apr 22 20:17:41.809066 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.809042    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8cpr_3512a00a-a9e9-4a62-b382-f679dbdd1b67/kube-rbac-proxy/0.log"
Apr 22 20:17:41.828986 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.828958    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k8cpr_3512a00a-a9e9-4a62-b382-f679dbdd1b67/init-textfile/0.log"
Apr 22 20:17:41.922846 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.922760    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-ffj7j_294d25c8-e955-4e1e-a99e-5c4f1130d221/kube-rbac-proxy-main/0.log"
Apr 22 20:17:41.942511 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.942469    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-ffj7j_294d25c8-e955-4e1e-a99e-5c4f1130d221/kube-rbac-proxy-self/0.log"
Apr 22 20:17:41.967115 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:41.967081    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-ffj7j_294d25c8-e955-4e1e-a99e-5c4f1130d221/openshift-state-metrics/0.log"
Apr 22 20:17:42.010278 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.010222    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/prometheus/0.log"
Apr 22 20:17:42.030708 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.030677    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/config-reloader/0.log"
Apr 22 20:17:42.053195 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.053167    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/thanos-sidecar/0.log"
Apr 22 20:17:42.076556 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.076524    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/kube-rbac-proxy-web/0.log"
Apr 22 20:17:42.098075 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.098041    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/kube-rbac-proxy/0.log"
Apr 22 20:17:42.121374 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.121333    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/kube-rbac-proxy-thanos/0.log"
Apr 22 20:17:42.143435 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.143405    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_ed73bd9e-c47b-450d-89f7-48927170ad3b/init-config-reloader/0.log"
Apr 22 20:17:42.174419 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.174351    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fh7v7_5a60d5a0-fc39-4655-8c92-ad8816d682db/prometheus-operator/0.log"
Apr 22 20:17:42.191874 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.191845    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-fh7v7_5a60d5a0-fc39-4655-8c92-ad8816d682db/kube-rbac-proxy/0.log"
Apr 22 20:17:42.215611 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.215581    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-kfrl6_1f2b7cd9-b3ea-4b04-9e45-1d2ff400e912/prometheus-operator-admission-webhook/0.log"
Apr 22 20:17:42.241588 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.241553    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5ffdb9456-lf64t_46272374-b555-4cc1-bf00-d09835122abb/telemeter-client/0.log"
Apr 22 20:17:42.264813 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.264780    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5ffdb9456-lf64t_46272374-b555-4cc1-bf00-d09835122abb/reload/0.log"
Apr 22 20:17:42.285360 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.285335    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5ffdb9456-lf64t_46272374-b555-4cc1-bf00-d09835122abb/kube-rbac-proxy/0.log"
Apr 22 20:17:42.315177 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.315140    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4fdb9788-4hxj7_3a405cd9-a1f0-440e-a6a7-dbcea8867edb/thanos-query/0.log"
Apr 22 20:17:42.337278 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.337221    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4fdb9788-4hxj7_3a405cd9-a1f0-440e-a6a7-dbcea8867edb/kube-rbac-proxy-web/0.log"
Apr 22 20:17:42.362860 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.362831    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4fdb9788-4hxj7_3a405cd9-a1f0-440e-a6a7-dbcea8867edb/kube-rbac-proxy/0.log"
Apr 22 20:17:42.384088 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.384056    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4fdb9788-4hxj7_3a405cd9-a1f0-440e-a6a7-dbcea8867edb/prom-label-proxy/0.log"
Apr 22 20:17:42.404300 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.404271    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4fdb9788-4hxj7_3a405cd9-a1f0-440e-a6a7-dbcea8867edb/kube-rbac-proxy-rules/0.log"
Apr 22 20:17:42.424358 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:42.424318    2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4fdb9788-4hxj7_3a405cd9-a1f0-440e-a6a7-dbcea8867edb/kube-rbac-proxy-metrics/0.log"
Apr 22 20:17:44.973827 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.973789    2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"]
Apr 22 20:17:44.974314 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.974200    2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="copy"
Apr 22 20:17:44.974314 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.974216    2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="copy"
Apr 22 20:17:44.974314 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.974268    2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="gather"
Apr 22 20:17:44.974314 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.974277    2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="gather"
Apr 22 20:17:44.974484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.974369    2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="gather"
Apr 22 20:17:44.974484 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.974386    2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9994eae9-d6c4-4d6e-a478-3c5b1658dd9b" containerName="copy"
Apr 22 20:17:44.977722 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.977700    2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"
Apr 22 20:17:44.988681 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:44.988657    2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"]
Apr 22 20:17:45.120287 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.120226    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-proc\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"
Apr 22 20:17:45.120483 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.120323    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-lib-modules\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"
Apr 22 20:17:45.120483 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.120368    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj52\" (UniqueName: \"kubernetes.io/projected/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-kube-api-access-wtj52\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"
Apr 22 20:17:45.120483 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.120452    2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-sys\") pod \"perf-node-gather-daemonset-dvxj7\" (UID:
\"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.120659 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.120550 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-podres\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.221628 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221596 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-sys\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.221813 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221665 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-podres\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.221813 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-proc\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.221813 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-sys\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.221813 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-lib-modules\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.221813 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221806 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-proc\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.222088 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221810 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj52\" (UniqueName: \"kubernetes.io/projected/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-kube-api-access-wtj52\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.222088 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.221884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-podres\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.222088 ip-10-0-131-194 kubenswrapper[2580]: I0422 
20:17:45.222081 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-lib-modules\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.231014 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.230950 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj52\" (UniqueName: \"kubernetes.io/projected/bc88b00d-dfe9-4b34-b4e3-e7c647e5db05-kube-api-access-wtj52\") pod \"perf-node-gather-daemonset-dvxj7\" (UID: \"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.291076 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.291043 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.437323 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.437100 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7"] Apr 22 20:17:45.442498 ip-10-0-131-194 kubenswrapper[2580]: W0422 20:17:45.442470 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc88b00d_dfe9_4b34_b4e3_e7c647e5db05.slice/crio-4fdfcaaf00c8466b1e2c4f1fd5c00969ba1e88c35770d98377493f72869b7bc1 WatchSource:0}: Error finding container 4fdfcaaf00c8466b1e2c4f1fd5c00969ba1e88c35770d98377493f72869b7bc1: Status 404 returned error can't find the container with id 4fdfcaaf00c8466b1e2c4f1fd5c00969ba1e88c35770d98377493f72869b7bc1 Apr 22 20:17:45.480266 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.480225 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gh454_ffd7b823-9cf8-4e86-ac80-22981e293e06/dns/0.log" Apr 22 20:17:45.499564 
ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.499544 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gh454_ffd7b823-9cf8-4e86-ac80-22981e293e06/kube-rbac-proxy/0.log" Apr 22 20:17:45.607358 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.607330 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2ss6g_0df19ae2-f6f9-432d-bbda-8015b4504723/dns-node-resolver/0.log" Apr 22 20:17:45.846280 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.846179 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" event={"ID":"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05","Type":"ContainerStarted","Data":"2ef9349d1beefc362bafce525d5bc03252a2ebb8b60077bcc40298dab50f6e39"} Apr 22 20:17:45.846280 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.846219 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" event={"ID":"bc88b00d-dfe9-4b34-b4e3-e7c647e5db05","Type":"ContainerStarted","Data":"4fdfcaaf00c8466b1e2c4f1fd5c00969ba1e88c35770d98377493f72869b7bc1"} Apr 22 20:17:45.846479 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.846298 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:45.864025 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:45.863967 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" podStartSLOduration=1.8639497839999999 podStartE2EDuration="1.863949784s" podCreationTimestamp="2026-04-22 20:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:17:45.862596224 +0000 UTC m=+1187.381489271" watchObservedRunningTime="2026-04-22 20:17:45.863949784 +0000 UTC 
m=+1187.382842820" Apr 22 20:17:46.031328 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:46.031297 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-46mjj_1296444e-df43-4223-beb7-c3de3946d7a7/node-ca/0.log" Apr 22 20:17:47.116223 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:47.116191 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wvnxp_ca734b69-52bc-4ae7-9171-1860a1388b9f/serve-healthcheck-canary/0.log" Apr 22 20:17:47.478729 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:47.478697 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9w8kc_c16ab73e-ca33-4cd7-bef1-c05fb49576cf/kube-rbac-proxy/0.log" Apr 22 20:17:47.497150 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:47.497124 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9w8kc_c16ab73e-ca33-4cd7-bef1-c05fb49576cf/exporter/0.log" Apr 22 20:17:47.517219 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:47.517188 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9w8kc_c16ab73e-ca33-4cd7-bef1-c05fb49576cf/extractor/0.log" Apr 22 20:17:49.590022 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:49.589995 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-7qvh7_0ccfb02d-f067-4364-81ed-47a37cbd739c/manager/0.log" Apr 22 20:17:49.704527 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:49.704498 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-4mc26_f29c876b-30f2-407e-8f4f-86cdab68ee88/s3-init/0.log" Apr 22 20:17:51.863191 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:51.862318 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-dvxj7" Apr 22 20:17:54.815088 
ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.814977 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/kube-multus-additional-cni-plugins/0.log" Apr 22 20:17:54.835473 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.835437 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/egress-router-binary-copy/0.log" Apr 22 20:17:54.861837 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.861816 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/cni-plugins/0.log" Apr 22 20:17:54.882140 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.882116 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/bond-cni-plugin/0.log" Apr 22 20:17:54.900549 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.900527 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/routeoverride-cni/0.log" Apr 22 20:17:54.920145 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.920088 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/whereabouts-cni-bincopy/0.log" Apr 22 20:17:54.939241 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:54.939218 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rsqjt_41c7e8c9-1a30-478c-8609-17a08d4db06c/whereabouts-cni/0.log" Apr 22 20:17:55.116090 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:55.116060 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-bmfgl_e460f2a4-02bb-4b8c-9775-8e03b6c0e88e/kube-multus/0.log" Apr 22 20:17:55.188647 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:55.188620 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fjgnl_9641a5d7-3e56-4f40-97db-ff0e3d5cb321/network-metrics-daemon/0.log" Apr 22 20:17:55.208384 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:55.208358 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fjgnl_9641a5d7-3e56-4f40-97db-ff0e3d5cb321/kube-rbac-proxy/0.log" Apr 22 20:17:56.585749 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.585656 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/ovn-controller/0.log" Apr 22 20:17:56.612325 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.612299 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/ovn-acl-logging/0.log" Apr 22 20:17:56.630881 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.630859 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/kube-rbac-proxy-node/0.log" Apr 22 20:17:56.651030 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.651008 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:17:56.670387 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.670363 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/northd/0.log" Apr 22 20:17:56.688204 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.688184 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/nbdb/0.log" Apr 22 20:17:56.707343 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.707323 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/sbdb/0.log" Apr 22 20:17:56.900602 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:56.900531 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wxp4m_13033536-961c-41e0-a8b1-73ef9eb5c983/ovnkube-controller/0.log" Apr 22 20:17:57.847907 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:57.847877 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-l4pll_0eaeb73f-d4a2-4a3a-8997-fd78247676aa/network-check-target-container/0.log" Apr 22 20:17:58.835843 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:58.835792 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-z79rl_b1bb13ae-9672-47e1-89b9-7a095040d199/iptables-alerter/0.log" Apr 22 20:17:59.437575 ip-10-0-131-194 kubenswrapper[2580]: I0422 20:17:59.437544 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6gkjl_ece6b521-94c5-4509-8a90-439f6a926c6b/tuned/0.log"