Apr 20 11:41:53.774566 ip-10-0-133-125 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 11:41:54.130197 ip-10-0-133-125 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:41:54.130197 ip-10-0-133-125 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 11:41:54.130197 ip-10-0-133-125 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 11:41:54.130197 ip-10-0-133-125 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 11:41:54.130197 ip-10-0-133-125 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
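The deprecation warnings above all point at the file passed via the kubelet's --config flag. As a hedged sketch only (field names are from the upstream kubelet.config.k8s.io/v1beta1 KubeletConfiguration API; aside from the CRI-O socket, which appears later in this log's flag dump, every path and value below is an illustrative placeholder, not taken from this node), the flagged flags map roughly to:

```yaml
# Sketch: config-file equivalents of the deprecated kubelet flags above.
# Field names are from kubelet.config.k8s.io/v1beta1; values are placeholders.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir (placeholder path)
systemReserved:                 # replaces --system-reserved (placeholder values)
  cpu: 500m
  memory: 1Gi
evictionHard:                   # per the warning, use eviction thresholds instead
  memory.available: 100Mi       #   of --minimum-container-ttl-duration (placeholder)
# --pod-infra-container-image has no config-file equivalent; per the warning,
# the image garbage collector takes sandbox image information from the CRI.
```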
Apr 20 11:41:54.132109 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.132013 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 11:41:54.134419 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134403 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:41:54.134419 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134419 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134422 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134426 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134429 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134431 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134434 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134437 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134440 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134442 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134445 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134447 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134450 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134453 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134455 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134458 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134461 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134463 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134466 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134469 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:41:54.134479 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134472 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134476 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134481 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134484 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134487 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134490 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134493 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134496 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134498 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134501 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134503 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134506 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134508 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134511 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134513 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134516 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134518 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134521 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134523 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:41:54.135018 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134525 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134528 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134531 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134534 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134537 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134540 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134542 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134544 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134549 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134552 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134554 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134557 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134560 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134562 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134565 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134569 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134572 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134574 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134577 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:41:54.135499 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134580 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134583 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134586 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134588 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134591 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134593 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134596 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134598 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134601 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134604 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134607 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134610 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134612 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134615 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134618 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134620 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134623 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134625 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134628 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134631 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:41:54.135996 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134634 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134636 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134639 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134641 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134644 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134646 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134649 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.134652 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135043 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135048 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135051 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135054 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135056 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135059 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135062 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135065 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135069 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135072 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135074 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 11:41:54.136509 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135077 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135080 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135083 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135086 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135089 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135091 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135094 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135096 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135099 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135101 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135104 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135106 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135109 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135111 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135113 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135117 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135119 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135121 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135124 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135126 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135130 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 11:41:54.136958 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135133 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135136 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135138 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135141 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135143 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135146 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135148 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135151 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135155 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135158 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135161 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135164 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135167 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135170 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135172 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135175 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135177 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135180 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135183 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135185 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 11:41:54.137489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135190 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135192 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135195 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135197 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135200 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135202 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135205 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135207 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135210 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135212 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135214 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135217 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135220 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135222 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135225 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135227 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135229 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135232 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135234 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 11:41:54.137973 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135251 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135254 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135257 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135259 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135261 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135264 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135267 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135270 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135272 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135275 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135277 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135280 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135283 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135288 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.135291 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135882 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135891 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135902 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135907 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135912 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135915 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 11:41:54.138486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135919 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135924 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135927 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135930 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135934 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135937 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135940 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135944 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135947 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135950 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135952 2578 flags.go:64] FLAG: --cloud-config=""
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135955 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135958 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135965 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135967 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135971 2578 flags.go:64] FLAG: --config-dir=""
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135973 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135977 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135981 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135991 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135995 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.135998 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136001 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136005 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 11:41:54.138993 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136009 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136012 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136015 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136020 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136023 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136026 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136028 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136031 2578 flags.go:64] FLAG: --enable-server="true"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136043 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136049 2578 flags.go:64] FLAG: --event-burst="100"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136052 2578 flags.go:64] FLAG: --event-qps="50"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136055 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136059 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136062 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136066 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136069 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136072 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136076 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136079 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136081 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136084 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136087 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136090 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136093 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136095 2578 flags.go:64] FLAG: --feature-gates=""
Apr 20 11:41:54.139574 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136100 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136103 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136106 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136110 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136114 2578 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136117 2578 flags.go:64] FLAG: --help="false"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136119 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136123 2578 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136126 2578 flags.go:64]
FLAG: --http-check-frequency="20s" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136129 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136132 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136135 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136138 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136141 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136144 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136147 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136150 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136153 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136156 2578 flags.go:64] FLAG: --kube-reserved="" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136159 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136162 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136165 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 11:41:54.140236 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:41:54.136168 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136171 2578 flags.go:64] FLAG: --lock-file="" Apr 20 11:41:54.140236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136173 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136176 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136179 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136184 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136187 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136190 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136193 2578 flags.go:64] FLAG: --logging-format="text" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136195 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136198 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136202 2578 flags.go:64] FLAG: --manifest-url="" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136204 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136209 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136212 2578 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136216 2578 flags.go:64] FLAG: --max-pods="110" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136219 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136222 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136225 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136228 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136231 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136233 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136236 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136262 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136265 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136268 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 11:41:54.140846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136271 2578 flags.go:64] FLAG: --pod-cidr="" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136274 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 11:41:54.141432 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:41:54.136279 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136282 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136286 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136289 2578 flags.go:64] FLAG: --port="10250" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136292 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136295 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ddb777ccc579c8f2" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136298 2578 flags.go:64] FLAG: --qos-reserved="" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136301 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136304 2578 flags.go:64] FLAG: --register-node="true" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136307 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136310 2578 flags.go:64] FLAG: --register-with-taints="" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136314 2578 flags.go:64] FLAG: --registry-burst="10" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136317 2578 flags.go:64] FLAG: --registry-qps="5" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136320 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136322 2578 flags.go:64] FLAG: --reserved-memory="" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:54.136326 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136329 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136332 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136335 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136338 2578 flags.go:64] FLAG: --runonce="false" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136342 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136345 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136348 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 20 11:41:54.141432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136351 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136353 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136357 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136360 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136363 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136366 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136369 2578 flags.go:64] FLAG: 
--storage-driver-table="stats" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136372 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136374 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136378 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136381 2578 flags.go:64] FLAG: --system-cgroups="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136384 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136389 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136392 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136395 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136400 2578 flags.go:64] FLAG: --tls-min-version="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136403 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136405 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136408 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136411 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136414 2578 flags.go:64] FLAG: --v="2" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:54.136418 2578 flags.go:64] FLAG: --version="false" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136427 2578 flags.go:64] FLAG: --vmodule="" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136432 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.136435 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 11:41:54.142023 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136534 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136537 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136540 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136543 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136547 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136552 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136555 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136558 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136560 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136563 2578 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136566 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136568 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136571 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136573 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136576 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136579 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136581 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136584 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136586 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136589 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:41:54.142660 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136591 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136594 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 
11:41:54.136596 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136599 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136601 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136605 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136609 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136612 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136614 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136617 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136619 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136623 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136626 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136629 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136631 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136633 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136636 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136640 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136643 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:41:54.143149 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136646 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136648 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136651 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136656 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136659 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:41:54.143664 ip-10-0-133-125 
kubenswrapper[2578]: W0420 11:41:54.136662 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136664 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136667 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136670 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136672 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136675 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136678 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136680 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136683 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136685 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136688 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136691 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136693 2578 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136697 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:41:54.143664 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136699 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136702 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136704 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136707 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136709 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136712 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136714 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136717 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136719 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136721 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136724 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:41:54.144128 ip-10-0-133-125 
kubenswrapper[2578]: W0420 11:41:54.136729 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136731 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136734 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136736 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136739 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136742 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136745 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136747 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136750 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:41:54.144128 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136752 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136755 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136757 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136760 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 
11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136762 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136765 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136767 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.136769 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.137269 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.143704 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.143720 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143771 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143777 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143780 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 
11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143783 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143786 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:41:54.144649 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143789 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143792 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143795 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143798 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143801 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143804 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143806 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143814 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143817 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143820 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143823 2578 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143825 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143828 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143830 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143833 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143836 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143838 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143841 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143843 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143846 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:41:54.145048 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143848 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143850 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143853 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: 
W0420 11:41:54.143855 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143858 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143860 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143863 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143867 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143869 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143872 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143875 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143877 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143880 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143882 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143885 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143887 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 
11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143890 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143892 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143894 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143897 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:41:54.145538 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143900 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143903 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143905 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143908 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143912 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143916 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143919 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143922 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143925 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143928 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143931 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143934 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143936 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143939 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143941 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143944 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143947 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143949 2578 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143953 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:41:54.146008 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143956 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143958 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143961 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143963 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143966 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143968 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143971 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143973 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143976 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143978 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143981 2578 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143983 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143986 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143988 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143991 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143996 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.143999 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144001 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144004 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144007 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 11:41:54.146477 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144011 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144015 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.144020 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144142 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144147 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144151 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144153 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144156 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144159 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144162 2578 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144165 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144168 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144171 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144174 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144177 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 11:41:54.146952 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144179 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144182 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144184 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144187 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144189 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144192 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144194 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 11:41:54.147432 
ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144197 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144199 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144202 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144204 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144207 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144209 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144213 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144215 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144218 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144220 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144223 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144226 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144228 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 
11:41:54.147432 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144231 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144233 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144236 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144258 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144261 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144263 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144266 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144270 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144274 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144276 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144279 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144281 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144284 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144286 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144289 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144291 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144294 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144296 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144300 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 11:41:54.147947 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144303 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144306 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144309 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144311 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144314 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144316 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144319 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144322 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144324 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144327 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144330 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144332 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144335 2578 feature_gate.go:328] unrecognized feature gate: 
ShortCertRotation Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144338 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144341 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144343 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144345 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144348 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144350 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144352 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 11:41:54.148505 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144355 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144357 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144360 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144362 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144365 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 11:41:54.148977 ip-10-0-133-125 
kubenswrapper[2578]: W0420 11:41:54.144367 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144370 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144372 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144374 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144377 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144379 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144382 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144384 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144386 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:54.144389 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 11:41:54.148977 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.144394 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 11:41:54.149370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.144983 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 11:41:54.149370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.146927 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 11:41:54.149370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.147777 2578 server.go:1019] "Starting client certificate rotation" Apr 20 11:41:54.149370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.147886 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 11:41:54.149370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.148736 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 11:41:54.168347 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.168327 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 11:41:54.172692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.172664 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 11:41:54.185643 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.185622 2578 log.go:25] "Validated CRI v1 runtime API" Apr 20 11:41:54.190733 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.190715 2578 log.go:25] "Validated CRI v1 image API" Apr 20 11:41:54.192844 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.192210 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 11:41:54.195577 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:54.195551 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a45d1f11-fb80-4d84-8543-63e89392d420:/dev/nvme0n1p3 b3dd301f-6aa6-4c73-8e35-a96d947486f0:/dev/nvme0n1p4] Apr 20 11:41:54.195653 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.195577 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 11:41:54.199006 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.198985 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 11:41:54.201715 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.201602 2578 manager.go:217] Machine: {Timestamp:2026-04-20 11:41:54.200012582 +0000 UTC m=+0.323431820 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099829 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28f92e3016bed20f126872c4c3c744 SystemUUID:ec28f92e-3016-bed2-0f12-6872c4c3c744 BootID:15f79688-dbe5-4452-ab2c-4e72d40447a3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 
DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:15:19:b5:a0:f1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:15:19:b5:a0:f1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:97:9c:50:3e:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 11:41:54.202211 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.202201 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 11:41:54.202334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.202316 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 11:41:54.203859 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.203837 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 11:41:54.204064 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.203862 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-125.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 11:41:54.204110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.204077 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 11:41:54.204110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.204086 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 11:41:54.204110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.204099 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 11:41:54.204650 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.204640 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 11:41:54.205466 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.205455 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 11:41:54.205595 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.205586 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 11:41:54.208263 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.208252 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 11:41:54.208314 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.208268 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 11:41:54.208314 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.208285 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 11:41:54.208314 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.208295 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 20 11:41:54.208314 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.208314 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 11:41:54.209380 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.209367 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 11:41:54.209439 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.209385 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 11:41:54.212303 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.212283 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 11:41:54.213534 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.213516 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 11:41:54.215853 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215825 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 11:41:54.215853 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215854 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215865 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215874 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215883 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215893 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215903 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215919 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215938 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215948 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215980 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 11:41:54.215994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.215994 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 11:41:54.216744 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.216732 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 11:41:54.216744 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.216746 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 11:41:54.220303 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.220136 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t79t2"
Apr 20 11:41:54.220585 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.220571 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 11:41:54.220633 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.220618 2578 server.go:1295] "Started kubelet"
Apr 20 11:41:54.220764 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.220718 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 11:41:54.220824 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.220795 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 11:41:54.220866 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.220722 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 11:41:54.221222 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.221199 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-125.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 11:41:54.221480 ip-10-0-133-125 systemd[1]: Started Kubernetes Kubelet.
Apr 20 11:41:54.221908 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.221848 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 11:41:54.222008 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.221984 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-125.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 11:41:54.222008 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.221986 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 11:41:54.222428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.222414 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 11:41:54.226941 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.226923 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 11:41:54.227432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.227416 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 11:41:54.228089 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228071 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 11:41:54.228089 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228091 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 11:41:54.228324 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228099 2578 factory.go:55] Registering systemd factory
Apr 20 11:41:54.228324 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228147 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 20 11:41:54.228324 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228178 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 11:41:54.228324 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228259 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 11:41:54.228324 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228268 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 11:41:54.228571 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.228388 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t79t2"
Apr 20 11:41:54.228752 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.228729 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.229548 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.229529 2578 factory.go:153] Registering CRI-O factory
Apr 20 11:41:54.229548 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.229547 2578 factory.go:223] Registration of the crio container factory successfully
Apr 20 11:41:54.229688 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.229593 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 11:41:54.229688 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.229608 2578 factory.go:103] Registering Raw factory
Apr 20 11:41:54.229688 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.229633 2578 manager.go:1196] Started watching for new ooms in manager
Apr 20 11:41:54.230046 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.230016 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 11:41:54.230046 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.230034 2578 manager.go:319] Starting recovery of all containers
Apr 20 11:41:54.231341 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.230517 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-125.ec2.internal.18a80de147c5765c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-125.ec2.internal,UID:ip-10-0-133-125.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-125.ec2.internal,},FirstTimestamp:2026-04-20 11:41:54.220586588 +0000 UTC m=+0.344005825,LastTimestamp:2026-04-20 11:41:54.220586588 +0000 UTC m=+0.344005825,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-125.ec2.internal,}"
Apr 20 11:41:54.231461 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.231441 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 11:41:54.241316 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.241293 2578 manager.go:324] Recovery completed
Apr 20 11:41:54.245748 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.245726 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-125.ec2.internal\" not found" node="ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.246937 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.246924 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:54.249301 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.249287 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:54.249367 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.249319 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:54.249367 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.249330 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:54.249820 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.249809 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 11:41:54.249858 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.249821 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 11:41:54.249858 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.249838 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 11:41:54.252153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.252141 2578 policy_none.go:49] "None policy: Start"
Apr 20 11:41:54.252197 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.252158 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 11:41:54.252197 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.252168 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 11:41:54.294839 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.294820 2578 manager.go:341] "Starting Device Plugin manager"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.294908 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.294922 2578 server.go:85] "Starting device plugin registration server"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.295179 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.295190 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.295322 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.295419 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.295429 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.296000 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 11:41:54.296368 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.296065 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.329172 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.329141 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 11:41:54.330519 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.330500 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 11:41:54.330638 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.330527 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 11:41:54.330638 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.330547 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 11:41:54.330638 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.330556 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 11:41:54.330638 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.330594 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 11:41:54.334812 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.334789 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:41:54.396002 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.395907 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:54.397118 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.397100 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:54.397218 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.397131 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:54.397218 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.397141 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:54.397218 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.397165 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.403376 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.403358 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.403461 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.403382 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-125.ec2.internal\": node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.419814 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.419789 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.430888 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.430857 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"]
Apr 20 11:41:54.430962 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.430955 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:54.432380 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.432366 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:54.432464 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.432393 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:54.432464 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.432403 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:54.433565 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.433552 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:54.434002 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.433982 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.434044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434016 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:54.434356 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434341 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:54.434416 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434376 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:54.434416 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434391 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:54.434668 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434653 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:54.434730 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434682 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:54.434730 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.434695 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:54.435575 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.435562 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.435626 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.435585 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 11:41:54.436281 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.436266 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 11:41:54.436347 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.436292 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 11:41:54.436347 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.436306 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeHasSufficientPID"
Apr 20 11:41:54.464914 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.464880 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-125.ec2.internal\" not found" node="ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.469411 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.469392 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-125.ec2.internal\" not found" node="ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.520428 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.520403 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.620786 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.620757 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.629125 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.629105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f84e17a26108ba2d3be8bac3d44320a4-config\") pod \"kube-apiserver-proxy-ip-10-0-133-125.ec2.internal\" (UID: \"f84e17a26108ba2d3be8bac3d44320a4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.629192 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.629133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5989097bed94e3d10ba9f36bf71c38b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal\" (UID: \"5989097bed94e3d10ba9f36bf71c38b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.629192 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.629157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5989097bed94e3d10ba9f36bf71c38b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal\" (UID: \"5989097bed94e3d10ba9f36bf71c38b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.721187 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.721099 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.729393 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.729366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f84e17a26108ba2d3be8bac3d44320a4-config\") pod \"kube-apiserver-proxy-ip-10-0-133-125.ec2.internal\" (UID: \"f84e17a26108ba2d3be8bac3d44320a4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.729461 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.729402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5989097bed94e3d10ba9f36bf71c38b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal\" (UID: \"5989097bed94e3d10ba9f36bf71c38b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.729461 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.729421 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5989097bed94e3d10ba9f36bf71c38b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal\" (UID: \"5989097bed94e3d10ba9f36bf71c38b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.729525 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.729465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f84e17a26108ba2d3be8bac3d44320a4-config\") pod \"kube-apiserver-proxy-ip-10-0-133-125.ec2.internal\" (UID: \"f84e17a26108ba2d3be8bac3d44320a4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.729560 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.729537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5989097bed94e3d10ba9f36bf71c38b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal\" (UID: \"5989097bed94e3d10ba9f36bf71c38b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.729591 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.729562 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5989097bed94e3d10ba9f36bf71c38b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal\" (UID: \"5989097bed94e3d10ba9f36bf71c38b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.766520 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.766490 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.772106 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:54.772089 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:54.821400 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.821368 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:54.921848 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:54.921812 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.022375 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.022312 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.078919 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.078895 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:41:55.122838 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.122791 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.147273 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.147222 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 11:41:55.147914 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.147380 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 11:41:55.147914 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.147418 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 11:41:55.223489 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.223459 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.227077 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.227063 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 11:41:55.231429 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.231362 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 11:36:54 +0000 UTC" deadline="2028-02-05 19:43:30.332483054 +0000 UTC"
Apr 20 11:41:55.231429 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.231429 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15752h1m35.101059314s"
Apr 20 11:41:55.242891 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.242870 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 11:41:55.272295 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.272255 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gwgt2"
Apr 20 11:41:55.281631 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.281574 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gwgt2"
Apr 20 11:41:55.290892 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:55.290865 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf84e17a26108ba2d3be8bac3d44320a4.slice/crio-ab0266de90d017a16f25e90c8ab4ffe7ea1017a5e7687a2b9255e283681a63dc WatchSource:0}: Error finding container ab0266de90d017a16f25e90c8ab4ffe7ea1017a5e7687a2b9255e283681a63dc: Status 404 returned error can't find the container with id ab0266de90d017a16f25e90c8ab4ffe7ea1017a5e7687a2b9255e283681a63dc
Apr 20 11:41:55.295526 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.295511 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 11:41:55.315653 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.315624 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:41:55.323806 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.323773 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.330426 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:55.330404 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5989097bed94e3d10ba9f36bf71c38b3.slice/crio-e392ce29e14b00b158d573604147ba81f16d2796e154e0a5e52f36d6fb670ab4 WatchSource:0}: Error finding container e392ce29e14b00b158d573604147ba81f16d2796e154e0a5e52f36d6fb670ab4: Status 404 returned error can't find the container with id e392ce29e14b00b158d573604147ba81f16d2796e154e0a5e52f36d6fb670ab4
Apr 20 11:41:55.333286 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.333225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal" event={"ID":"5989097bed94e3d10ba9f36bf71c38b3","Type":"ContainerStarted","Data":"e392ce29e14b00b158d573604147ba81f16d2796e154e0a5e52f36d6fb670ab4"}
Apr 20 11:41:55.334342 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.334321 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal" event={"ID":"f84e17a26108ba2d3be8bac3d44320a4","Type":"ContainerStarted","Data":"ab0266de90d017a16f25e90c8ab4ffe7ea1017a5e7687a2b9255e283681a63dc"}
Apr 20 11:41:55.424169 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.424114 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.524655 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.524610 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.625276 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.625190 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.726015 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:55.725980 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-125.ec2.internal\" not found"
Apr 20 11:41:55.811633 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.811600 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 11:41:55.827733 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.827693 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:55.840098 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.840056 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 11:41:55.841083 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.841059 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal"
Apr 20 11:41:55.851807 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:55.851694 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 11:41:56.210640 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.210612 2578 apiserver.go:52] "Watching apiserver"
Apr 20 11:41:56.219719 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.219689 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 11:41:56.220028 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.220002 2578 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["openshift-ovn-kubernetes/ovnkube-node-8q26d","kube-system/konnectivity-agent-nxf6x","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z","openshift-image-registry/node-ca-ltzgg","openshift-multus/multus-additional-cni-plugins-vg6zv","openshift-network-operator/iptables-alerter-s8s8p","kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal","openshift-cluster-node-tuning-operator/tuned-spb5n","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal","openshift-multus/multus-s6chk","openshift-multus/network-metrics-daemon-4lcnh","openshift-network-diagnostics/network-check-target-qzgrd"] Apr 20 11:41:56.221844 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.221822 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.222716 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.222691 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.223844 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.223822 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.224755 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.224736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 11:41:56.224982 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.224965 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.225066 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.225013 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 11:41:56.225971 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.225948 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 11:41:56.226085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.226031 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.226148 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.226126 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 11:41:56.226717 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.226702 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 11:41:56.226857 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.226839 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 11:41:56.226937 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.226915 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-gbzm5\"" Apr 20 11:41:56.227052 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.227031 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 11:41:56.227154 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.227137 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" 
Apr 20 11:41:56.227231 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.227218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.227974 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.227984 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.227993 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sgnmv\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228074 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228075 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228164 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ttmbb\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228398 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228461 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-s669b\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228468 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.228473 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.229159 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 11:41:56.229375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.229193 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 11:41:56.230456 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.230429 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hzhl5\"" Apr 20 11:41:56.230683 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.230658 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 11:41:56.232757 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.232740 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 11:41:56.233221 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.233197 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:41:56.233530 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.233512 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 11:41:56.233619 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.233600 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gdngf\"" Apr 20 11:41:56.233986 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.233967 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z5tr9\"" Apr 20 11:41:56.234050 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.234012 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:41:56.234050 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.234021 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 11:41:56.234137 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.234092 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 11:41:56.234351 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.234288 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 11:41:56.236414 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.236394 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.236520 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.236478 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.236586 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.236553 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:41:56.237802 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237780 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-modprobe-d\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.237907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237803 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:41:56.237907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-os-release\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.237907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237839 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-kubelet\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.237907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-node-log\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.237907 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.237861 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:41:56.237907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-cnibin\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.237980 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f7fd4abe-2e90-42e4-b4e1-6d43241cd39a-konnectivity-ca\") pod \"konnectivity-agent-nxf6x\" (UID: \"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a\") " pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-run\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238032 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-fbnnm\" (UniqueName: \"kubernetes.io/projected/6316a3d4-5236-4574-91c5-ccd6e85aee53-kube-api-access-fbnnm\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f7fd4abe-2e90-42e4-b4e1-6d43241cd39a-agent-certs\") pod \"konnectivity-agent-nxf6x\" (UID: \"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a\") " pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-systemd\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238106 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72feb6a6-4564-4f5a-a26a-008d43db43b7-host-slash\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.238209 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:41:56.238121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-env-overrides\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238142 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-device-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238166 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-lib-modules\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-slash\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238209 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-var-lib-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: 
\"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-tuned\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238319 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-sys\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysctl-d\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-system-cni-dir\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-systemd-units\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-systemd\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238441 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh7bf\" (UniqueName: \"kubernetes.io/projected/6933a359-bd42-4dcd-94d7-cc72b948509c-kube-api-access-wh7bf\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: 
I0420 11:41:56.238461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-registration-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6316a3d4-5236-4574-91c5-ccd6e85aee53-tmp\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238496 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr5k\" (UniqueName: \"kubernetes.io/projected/72feb6a6-4564-4f5a-a26a-008d43db43b7-kube-api-access-wgr5k\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-ovnkube-config\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238523 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-socket-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: 
\"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-cni-binary-copy\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.238789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238563 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-host\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238636 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtdh\" (UniqueName: \"kubernetes.io/projected/6276a0c3-e955-4691-b383-18751303b9e2-kube-api-access-tgtdh\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-ovn\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-cni-bin\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-cni-netd\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5ch4\" (UniqueName: \"kubernetes.io/projected/f6bd444d-179f-465d-9358-90444a0bd1b0-kube-api-access-x5ch4\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-var-lib-kubelet\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238849 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-log-socket\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238876 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-run-netns\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-serviceca\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238905 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysctl-conf\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238908 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238951 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xnlkj\"" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238963 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-etc-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.239431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.238984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: 
I0420 11:41:56.238999 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-ovnkube-script-lib\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-sys-fs\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239031 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72feb6a6-4564-4f5a-a26a-008d43db43b7-iptables-alerter-script\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6933a359-bd42-4dcd-94d7-cc72b948509c-ovn-node-metrics-cert\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-host\") pod \"node-ca-ltzgg\" (UID: 
\"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcf6q\" (UniqueName: \"kubernetes.io/projected/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-kube-api-access-xcf6q\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-kubernetes\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.240081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.239134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysconfig\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.282601 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.282560 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 11:36:55 +0000 UTC" deadline="2027-10-08 00:24:54.287840825 +0000 UTC" Apr 20 11:41:56.282601 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.282597 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12852h42m58.005247302s" Apr 20 11:41:56.312858 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.312824 2578 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 11:41:56.329195 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.329165 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 11:41:56.340117 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysctl-d\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.340117 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340118 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-system-cni-dir\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-systemd-units\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-systemd\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340189 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wh7bf\" (UniqueName: \"kubernetes.io/projected/6933a359-bd42-4dcd-94d7-cc72b948509c-kube-api-access-wh7bf\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-registration-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-system-cni-dir\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340231 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6316a3d4-5236-4574-91c5-ccd6e85aee53-tmp\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-systemd\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.340348 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:56.340304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysctl-d\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.340721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-systemd-units\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.340721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-registration-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.340721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340581 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340306 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgr5k\" (UniqueName: \"kubernetes.io/projected/72feb6a6-4564-4f5a-a26a-008d43db43b7-kube-api-access-wgr5k\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-ovnkube-config\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-socket-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m55k7\" (UniqueName: \"kubernetes.io/projected/d9165296-57f0-4590-ad83-189871356a1a-kube-api-access-m55k7\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-cni-binary-copy\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.340902 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340912 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.341198 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.340965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-host\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.341198 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-socket-dir-parent\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.341317 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:56.341213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.341317 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtdh\" (UniqueName: \"kubernetes.io/projected/6276a0c3-e955-4691-b383-18751303b9e2-kube-api-access-tgtdh\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.341317 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341302 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341415 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341327 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-ovn\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341415 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-cni-bin\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341415 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-cni-netd\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341415 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5ch4\" (UniqueName: \"kubernetes.io/projected/f6bd444d-179f-465d-9358-90444a0bd1b0-kube-api-access-x5ch4\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.341543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-var-lib-kubelet\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.341543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-log-socket\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-run-netns\") pod \"ovnkube-node-8q26d\" 
(UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-serviceca\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.341543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-ovnkube-config\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysctl-conf\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.341721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-cni-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.341721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-conf-dir\") pod 
\"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.341721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341631 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-cni-binary-copy\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.341721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-etc-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341753 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-socket-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.341846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-ovnkube-script-lib\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.341846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341777 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysctl-conf\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.341846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-sys-fs\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.341846 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-host\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-system-cni-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341865 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-var-lib-kubelet\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-sys-fs\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-cni-multus\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-hostroot\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-run-netns\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72feb6a6-4564-4f5a-a26a-008d43db43b7-iptables-alerter-script\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-log-socket\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.341980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6933a359-bd42-4dcd-94d7-cc72b948509c-ovn-node-metrics-cert\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-ovn\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-etc-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342119 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342141 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-serviceca\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.342428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-run-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.342428 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342316 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.342677 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:41:56.342260 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-cni-netd\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342752 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-host\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.342811 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcf6q\" (UniqueName: \"kubernetes.io/projected/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-kube-api-access-xcf6q\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.342888 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-kubernetes\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.342922 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342908 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-cni-bin\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.342956 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:56.342934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-host\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.342984 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/72feb6a6-4564-4f5a-a26a-008d43db43b7-iptables-alerter-script\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.343469 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.342814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-kubernetes\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.343552 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343517 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-cnibin\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.343630 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6dd0225-09bd-4349-9632-48a466010b96-multus-daemon-config\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.343727 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:41:56.343649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-etc-kubernetes\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.343727 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysconfig\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.343829 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343724 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-kubelet\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.343829 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-modprobe-d\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.343829 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-ovnkube-script-lib\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.343962 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:41:56.343859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-sysconfig\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.343962 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-os-release\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.343962 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-kubelet\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.343962 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343943 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-node-log\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.343978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-cnibin\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 
11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f7fd4abe-2e90-42e4-b4e1-6d43241cd39a-konnectivity-ca\") pod \"konnectivity-agent-nxf6x\" (UID: \"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a\") " pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-run\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-kubelet\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-modprobe-d\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " 
pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbnnm\" (UniqueName: \"kubernetes.io/projected/6316a3d4-5236-4574-91c5-ccd6e85aee53-kube-api-access-fbnnm\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344121 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-os-release\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-multus-certs\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f7fd4abe-2e90-42e4-b4e1-6d43241cd39a-agent-certs\") pod \"konnectivity-agent-nxf6x\" (UID: \"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a\") " pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344225 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-etc-selinux\") pod 
\"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-systemd\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344312 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6dd0225-09bd-4349-9632-48a466010b96-cni-binary-copy\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72feb6a6-4564-4f5a-a26a-008d43db43b7-host-slash\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344387 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-env-overrides\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.344444 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344422 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-device-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344454 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-lib-modules\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-slash\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344517 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-var-lib-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344585 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344618 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-tuned\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344648 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-os-release\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-run\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-k8s-cni-cncf-io\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344718 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-cni-bin\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6276a0c3-e955-4691-b383-18751303b9e2-cnibin\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.344765 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-sys\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-netns\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.345284 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:41:56.344839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-systemd\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q975r\" (UniqueName: \"kubernetes.io/projected/f6dd0225-09bd-4349-9632-48a466010b96-kube-api-access-q975r\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344948 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-var-lib-openvswitch\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-run-ovn-kubernetes\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" 
Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.344788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-host-slash\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-sys\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345108 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-etc-selinux\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345170 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/72feb6a6-4564-4f5a-a26a-008d43db43b7-host-slash\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.345284 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6276a0c3-e955-4691-b383-18751303b9e2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " 
pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.345782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f7fd4abe-2e90-42e4-b4e1-6d43241cd39a-konnectivity-ca\") pod \"konnectivity-agent-nxf6x\" (UID: \"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a\") " pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.345782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345426 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6bd444d-179f-465d-9358-90444a0bd1b0-device-dir\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.345782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345521 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6316a3d4-5236-4574-91c5-ccd6e85aee53-lib-modules\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.345782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6933a359-bd42-4dcd-94d7-cc72b948509c-node-log\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.345782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.345600 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6933a359-bd42-4dcd-94d7-cc72b948509c-env-overrides\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.346072 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.346050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6933a359-bd42-4dcd-94d7-cc72b948509c-ovn-node-metrics-cert\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.348943 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.348915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6316a3d4-5236-4574-91c5-ccd6e85aee53-etc-tuned\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.349466 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.349434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6316a3d4-5236-4574-91c5-ccd6e85aee53-tmp\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.349874 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.349807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f7fd4abe-2e90-42e4-b4e1-6d43241cd39a-agent-certs\") pod \"konnectivity-agent-nxf6x\" (UID: \"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a\") " pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.352880 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.352854 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh7bf\" (UniqueName: \"kubernetes.io/projected/6933a359-bd42-4dcd-94d7-cc72b948509c-kube-api-access-wh7bf\") pod \"ovnkube-node-8q26d\" (UID: \"6933a359-bd42-4dcd-94d7-cc72b948509c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.353648 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.353614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcf6q\" (UniqueName: \"kubernetes.io/projected/8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d-kube-api-access-xcf6q\") pod \"node-ca-ltzgg\" (UID: \"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d\") " pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.353648 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.353640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgr5k\" (UniqueName: \"kubernetes.io/projected/72feb6a6-4564-4f5a-a26a-008d43db43b7-kube-api-access-wgr5k\") pod \"iptables-alerter-s8s8p\" (UID: \"72feb6a6-4564-4f5a-a26a-008d43db43b7\") " pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.354303 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.354281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbnnm\" (UniqueName: \"kubernetes.io/projected/6316a3d4-5236-4574-91c5-ccd6e85aee53-kube-api-access-fbnnm\") pod \"tuned-spb5n\" (UID: \"6316a3d4-5236-4574-91c5-ccd6e85aee53\") " pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.354945 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.354926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5ch4\" (UniqueName: \"kubernetes.io/projected/f6bd444d-179f-465d-9358-90444a0bd1b0-kube-api-access-x5ch4\") pod \"aws-ebs-csi-driver-node-zzv4z\" (UID: \"f6bd444d-179f-465d-9358-90444a0bd1b0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.355277 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.355255 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtdh\" (UniqueName: \"kubernetes.io/projected/6276a0c3-e955-4691-b383-18751303b9e2-kube-api-access-tgtdh\") pod 
\"multus-additional-cni-plugins-vg6zv\" (UID: \"6276a0c3-e955-4691-b383-18751303b9e2\") " pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.445354 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-cnibin\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445517 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6dd0225-09bd-4349-9632-48a466010b96-multus-daemon-config\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445517 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-etc-kubernetes\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445517 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-kubelet\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445517 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-multus-certs\") pod \"multus-s6chk\" 
(UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445517 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445456 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-cnibin\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445517 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-etc-kubernetes\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-multus-certs\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445465 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6dd0225-09bd-4349-9632-48a466010b96-cni-binary-copy\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " 
pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-kubelet\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-os-release\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-k8s-cni-cncf-io\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-cni-bin\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-netns\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " 
pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445787 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q975r\" (UniqueName: \"kubernetes.io/projected/f6dd0225-09bd-4349-9632-48a466010b96-kube-api-access-q975r\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.445860 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m55k7\" (UniqueName: \"kubernetes.io/projected/d9165296-57f0-4590-ad83-189871356a1a-kube-api-access-m55k7\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f6dd0225-09bd-4349-9632-48a466010b96-multus-daemon-config\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-cni-bin\") pod \"multus-s6chk\" (UID: 
\"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f6dd0225-09bd-4349-9632-48a466010b96-cni-binary-copy\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-socket-dir-parent\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-netns\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-cni-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.445986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-run-k8s-cni-cncf-io\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " 
pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-conf-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-system-cni-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446032 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-socket-dir-parent\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-cni-multus\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-hostroot\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 
ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-os-release\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-hostroot\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.446123 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-conf-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-multus-cni-dir\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446151 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-system-cni-dir\") pod \"multus-s6chk\" (UID: 
\"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.446154 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f6dd0225-09bd-4349-9632-48a466010b96-host-var-lib-cni-multus\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.446963 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.446194 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:41:56.946177085 +0000 UTC m=+3.069596327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:56.453483 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.453457 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:41:56.453483 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.453485 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:41:56.453679 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.453498 2578 projected.go:194] Error preparing data for projected volume kube-api-access-cmkkr for pod openshift-network-diagnostics/network-check-target-qzgrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:56.453679 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.453565 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr podName:bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea nodeName:}" failed. No retries permitted until 2026-04-20 11:41:56.953550223 +0000 UTC m=+3.076969468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cmkkr" (UniqueName: "kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr") pod "network-check-target-qzgrd" (UID: "bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:56.457305 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.457280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m55k7\" (UniqueName: \"kubernetes.io/projected/d9165296-57f0-4590-ad83-189871356a1a-kube-api-access-m55k7\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.457731 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.457716 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q975r\" (UniqueName: \"kubernetes.io/projected/f6dd0225-09bd-4349-9632-48a466010b96-kube-api-access-q975r\") pod \"multus-s6chk\" (UID: \"f6dd0225-09bd-4349-9632-48a466010b96\") " pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.537652 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.537565 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:41:56.544447 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.544422 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:41:56.552150 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.552127 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" Apr 20 11:41:56.557810 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.557783 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ltzgg" Apr 20 11:41:56.564418 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.564397 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" Apr 20 11:41:56.570950 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.570927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s8s8p" Apr 20 11:41:56.577516 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.577499 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-spb5n" Apr 20 11:41:56.583034 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.583013 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s6chk" Apr 20 11:41:56.940986 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.940814 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72feb6a6_4564_4f5a_a26a_008d43db43b7.slice/crio-5ebb497e82dd262961010d7208bffb4a13c8c987830c8f6b37bfe4f4f181a2e8 WatchSource:0}: Error finding container 5ebb497e82dd262961010d7208bffb4a13c8c987830c8f6b37bfe4f4f181a2e8: Status 404 returned error can't find the container with id 5ebb497e82dd262961010d7208bffb4a13c8c987830c8f6b37bfe4f4f181a2e8 Apr 20 11:41:56.941531 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.941504 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6dd0225_09bd_4349_9632_48a466010b96.slice/crio-67755cfce846fc1d3bf43db59a0eaa4f558c9c8e278745f9277bf15105f7bfca WatchSource:0}: Error finding container 67755cfce846fc1d3bf43db59a0eaa4f558c9c8e278745f9277bf15105f7bfca: Status 404 returned error can't find the container with id 67755cfce846fc1d3bf43db59a0eaa4f558c9c8e278745f9277bf15105f7bfca Apr 20 11:41:56.942758 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.942733 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6316a3d4_5236_4574_91c5_ccd6e85aee53.slice/crio-68d34c89ac2b7d879b1a1ce8610eec1180fc7be4e07627ea457256eaee8d94f6 WatchSource:0}: Error finding container 68d34c89ac2b7d879b1a1ce8610eec1180fc7be4e07627ea457256eaee8d94f6: Status 404 returned error can't find the container with id 68d34c89ac2b7d879b1a1ce8610eec1180fc7be4e07627ea457256eaee8d94f6 Apr 20 11:41:56.944301 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.944281 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6bd444d_179f_465d_9358_90444a0bd1b0.slice/crio-b1055ab9fa339bd3276fdedbbc7e04eca7a27acf8113a622b7c93f7723fe57de WatchSource:0}: Error finding container b1055ab9fa339bd3276fdedbbc7e04eca7a27acf8113a622b7c93f7723fe57de: Status 404 returned error can't find the container with id b1055ab9fa339bd3276fdedbbc7e04eca7a27acf8113a622b7c93f7723fe57de Apr 20 11:41:56.946329 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.946303 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e78ca5b_c7fc_4c32_ae65_ccfc944fc66d.slice/crio-a6c1a3c0834211d323d82e7b36f426a68f38f7e5a402966b7bf7b8257775140d WatchSource:0}: Error finding container a6c1a3c0834211d323d82e7b36f426a68f38f7e5a402966b7bf7b8257775140d: Status 404 returned error can't find the container with id a6c1a3c0834211d323d82e7b36f426a68f38f7e5a402966b7bf7b8257775140d Apr 20 11:41:56.947657 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.947575 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6276a0c3_e955_4691_b383_18751303b9e2.slice/crio-abafa2ad8ae6d7f059a6f4ca75c4cb7ff821c6901865a7bd77e5c118613505c5 WatchSource:0}: Error finding container abafa2ad8ae6d7f059a6f4ca75c4cb7ff821c6901865a7bd77e5c118613505c5: Status 404 returned error can't find the container with id abafa2ad8ae6d7f059a6f4ca75c4cb7ff821c6901865a7bd77e5c118613505c5 Apr 20 11:41:56.948600 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.948578 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6933a359_bd42_4dcd_94d7_cc72b948509c.slice/crio-80228108c34f01208e736c55645146e3222f41bbcf12b9e965885d3464e457a9 WatchSource:0}: Error finding container 80228108c34f01208e736c55645146e3222f41bbcf12b9e965885d3464e457a9: Status 404 returned error can't find 
the container with id 80228108c34f01208e736c55645146e3222f41bbcf12b9e965885d3464e457a9 Apr 20 11:41:56.949489 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:41:56.949466 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7fd4abe_2e90_42e4_b4e1_6d43241cd39a.slice/crio-0563c57e8041e9d0db9108e51e31282317238107030365ec7c436605ee094a2e WatchSource:0}: Error finding container 0563c57e8041e9d0db9108e51e31282317238107030365ec7c436605ee094a2e: Status 404 returned error can't find the container with id 0563c57e8041e9d0db9108e51e31282317238107030365ec7c436605ee094a2e Apr 20 11:41:56.949702 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:56.949681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:56.949874 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.949839 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:56.949905 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:56.949887 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:41:57.949870116 +0000 UTC m=+4.073289347 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:57.050884 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.050859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:41:57.051003 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:57.050988 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:41:57.051003 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:57.051002 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:41:57.051084 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:57.051011 2578 projected.go:194] Error preparing data for projected volume kube-api-access-cmkkr for pod openshift-network-diagnostics/network-check-target-qzgrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:57.051084 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:57.051054 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr podName:bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea nodeName:}" failed. 
No retries permitted until 2026-04-20 11:41:58.051041287 +0000 UTC m=+4.174460512 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmkkr" (UniqueName: "kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr") pod "network-check-target-qzgrd" (UID: "bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:57.283658 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.283536 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 11:36:55 +0000 UTC" deadline="2027-09-27 13:10:34.234681911 +0000 UTC" Apr 20 11:41:57.283658 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.283581 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12601h28m36.951104185s" Apr 20 11:41:57.342262 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.342209 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6chk" event={"ID":"f6dd0225-09bd-4349-9632-48a466010b96","Type":"ContainerStarted","Data":"67755cfce846fc1d3bf43db59a0eaa4f558c9c8e278745f9277bf15105f7bfca"} Apr 20 11:41:57.345226 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.345171 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-spb5n" event={"ID":"6316a3d4-5236-4574-91c5-ccd6e85aee53","Type":"ContainerStarted","Data":"68d34c89ac2b7d879b1a1ce8610eec1180fc7be4e07627ea457256eaee8d94f6"} Apr 20 11:41:57.351923 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.351888 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal" 
event={"ID":"f84e17a26108ba2d3be8bac3d44320a4","Type":"ContainerStarted","Data":"1b3c825f46e51af005d9bad7bbcdf9e8d9abaccecbafde59e51ec998cf10aa33"} Apr 20 11:41:57.355325 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.355270 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" event={"ID":"f6bd444d-179f-465d-9358-90444a0bd1b0","Type":"ContainerStarted","Data":"b1055ab9fa339bd3276fdedbbc7e04eca7a27acf8113a622b7c93f7723fe57de"} Apr 20 11:41:57.369749 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.364432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s8s8p" event={"ID":"72feb6a6-4564-4f5a-a26a-008d43db43b7","Type":"ContainerStarted","Data":"5ebb497e82dd262961010d7208bffb4a13c8c987830c8f6b37bfe4f4f181a2e8"} Apr 20 11:41:57.371047 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.370977 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nxf6x" event={"ID":"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a","Type":"ContainerStarted","Data":"0563c57e8041e9d0db9108e51e31282317238107030365ec7c436605ee094a2e"} Apr 20 11:41:57.379437 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.379400 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"80228108c34f01208e736c55645146e3222f41bbcf12b9e965885d3464e457a9"} Apr 20 11:41:57.383183 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.383136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerStarted","Data":"abafa2ad8ae6d7f059a6f4ca75c4cb7ff821c6901865a7bd77e5c118613505c5"} Apr 20 11:41:57.388921 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.388895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/node-ca-ltzgg" event={"ID":"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d","Type":"ContainerStarted","Data":"a6c1a3c0834211d323d82e7b36f426a68f38f7e5a402966b7bf7b8257775140d"} Apr 20 11:41:57.959596 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:57.959564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:57.959726 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:57.959703 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:57.959781 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:57.959768 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:41:59.959749834 +0000 UTC m=+6.083169075 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:58.060804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.060758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:41:58.060994 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:58.060941 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:41:58.060994 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:58.060960 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:41:58.060994 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:58.060973 2578 projected.go:194] Error preparing data for projected volume kube-api-access-cmkkr for pod openshift-network-diagnostics/network-check-target-qzgrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:58.061147 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:58.061037 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr podName:bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:00.061017018 +0000 UTC m=+6.184436279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmkkr" (UniqueName: "kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr") pod "network-check-target-qzgrd" (UID: "bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:41:58.094469 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.094390 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 11:41:58.332498 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.332467 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:58.332955 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:58.332616 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:41:58.332955 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.332467 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:41:58.332955 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:58.332728 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:41:58.413703 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.412440 2578 generic.go:358] "Generic (PLEG): container finished" podID="5989097bed94e3d10ba9f36bf71c38b3" containerID="16f0fc9ee63fe871f77460d446bd36e84455e23a375c3902cbb0e31ada069dd8" exitCode=0 Apr 20 11:41:58.413703 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.413383 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal" event={"ID":"5989097bed94e3d10ba9f36bf71c38b3","Type":"ContainerDied","Data":"16f0fc9ee63fe871f77460d446bd36e84455e23a375c3902cbb0e31ada069dd8"} Apr 20 11:41:58.428070 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:58.428007 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-125.ec2.internal" podStartSLOduration=3.427987889 podStartE2EDuration="3.427987889s" podCreationTimestamp="2026-04-20 11:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:41:57.367139147 +0000 UTC m=+3.490558397" watchObservedRunningTime="2026-04-20 11:41:58.427987889 +0000 UTC m=+4.551407139" Apr 20 11:41:59.418805 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:59.418747 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal" event={"ID":"5989097bed94e3d10ba9f36bf71c38b3","Type":"ContainerStarted","Data":"2549aa1d2bcf46249c09c6974a6469c311643bd09e0116421517c36f2a69e0ee"} Apr 20 11:41:59.978375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:41:59.978338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:41:59.978560 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:59.978501 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:41:59.978560 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:41:59.978559 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:42:03.978544635 +0000 UTC m=+10.101963860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:00.080407 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:00.079703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:00.080407 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:00.079881 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:00.080407 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:00.079905 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:00.080407 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:00.079917 2578 projected.go:194] Error preparing data for projected volume kube-api-access-cmkkr for pod openshift-network-diagnostics/network-check-target-qzgrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:00.080407 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:00.079977 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr podName:bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea nodeName:}" failed. No retries permitted until 2026-04-20 11:42:04.079957407 +0000 UTC m=+10.203376648 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmkkr" (UniqueName: "kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr") pod "network-check-target-qzgrd" (UID: "bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:00.331672 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:00.331591 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:00.331834 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:00.331748 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:00.332206 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:00.332184 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:00.332326 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:00.332303 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:02.332657 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:02.332617 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:02.333132 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:02.332774 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:02.333375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:02.333353 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:02.333483 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:02.333457 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:04.013646 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.013606 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:04.014086 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.013771 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:04.014086 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.013837 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:42:12.013817767 +0000 UTC m=+18.137236995 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:04.114557 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.114513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:04.114729 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.114700 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:04.114804 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.114737 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:04.114804 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.114750 2578 projected.go:194] Error preparing data for projected volume kube-api-access-cmkkr for pod openshift-network-diagnostics/network-check-target-qzgrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:04.114908 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.114805 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr podName:bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:12.114785359 +0000 UTC m=+18.238204601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmkkr" (UniqueName: "kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr") pod "network-check-target-qzgrd" (UID: "bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:04.332013 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.331932 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:04.332013 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.331979 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:04.332227 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.332088 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:04.332285 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.332227 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:04.915979 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.915917 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-125.ec2.internal" podStartSLOduration=9.915896517 podStartE2EDuration="9.915896517s" podCreationTimestamp="2026-04-20 11:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:41:59.437853693 +0000 UTC m=+5.561272940" watchObservedRunningTime="2026-04-20 11:42:04.915896517 +0000 UTC m=+11.039315764" Apr 20 11:42:04.919402 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.919370 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vn8x4"] Apr 20 11:42:04.925683 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:04.925656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:04.925805 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:04.925752 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:05.023763 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.023713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bc1a0b4c-542c-4194-8902-65ea34abd811-kubelet-config\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.024269 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.023772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.024269 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.023815 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bc1a0b4c-542c-4194-8902-65ea34abd811-dbus\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.124175 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.124131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bc1a0b4c-542c-4194-8902-65ea34abd811-kubelet-config\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.124371 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.124190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.124371 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.124217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bc1a0b4c-542c-4194-8902-65ea34abd811-dbus\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.124470 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.124414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bc1a0b4c-542c-4194-8902-65ea34abd811-dbus\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.124470 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.124466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bc1a0b4c-542c-4194-8902-65ea34abd811-kubelet-config\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.124575 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:05.124558 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:05.124658 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:05.124647 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret podName:bc1a0b4c-542c-4194-8902-65ea34abd811 nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:05.624603855 +0000 UTC m=+11.748023087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret") pod "global-pull-secret-syncer-vn8x4" (UID: "bc1a0b4c-542c-4194-8902-65ea34abd811") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:05.627550 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:05.627514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:05.627737 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:05.627645 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:05.627737 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:05.627716 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret podName:bc1a0b4c-542c-4194-8902-65ea34abd811 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:06.627697798 +0000 UTC m=+12.751117023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret") pod "global-pull-secret-syncer-vn8x4" (UID: "bc1a0b4c-542c-4194-8902-65ea34abd811") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:06.332172 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:06.331655 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:06.332172 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:06.331699 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:06.332172 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:06.331769 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:06.332172 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:06.331925 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:06.635041 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:06.634956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:06.635198 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:06.635123 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:06.635256 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:06.635200 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret podName:bc1a0b4c-542c-4194-8902-65ea34abd811 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:08.635184393 +0000 UTC m=+14.758603618 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret") pod "global-pull-secret-syncer-vn8x4" (UID: "bc1a0b4c-542c-4194-8902-65ea34abd811") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:07.331110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:07.331075 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:07.331328 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:07.331215 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:08.331378 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:08.331288 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:08.331795 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:08.331422 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:08.331795 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:08.331484 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:08.331795 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:08.331585 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:08.648153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:08.648056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:08.648816 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:08.648786 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:08.648958 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:08.648916 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret podName:bc1a0b4c-542c-4194-8902-65ea34abd811 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:12.648892185 +0000 UTC m=+18.772311414 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret") pod "global-pull-secret-syncer-vn8x4" (UID: "bc1a0b4c-542c-4194-8902-65ea34abd811") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:09.331682 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:09.331651 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:09.331991 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:09.331753 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:10.333971 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:10.331767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:10.333971 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:10.331906 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:10.333971 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:10.332480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:10.333971 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:10.332575 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:11.331009 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:11.330969 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:11.331189 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:11.331105 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:12.071539 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:12.071504 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:12.071922 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.071668 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:12.071922 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.071746 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:42:28.07172483 +0000 UTC m=+34.195144068 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 11:42:12.172719 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:12.172686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:12.172945 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.172832 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 11:42:12.172945 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.172849 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 11:42:12.172945 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.172860 2578 projected.go:194] Error preparing data for projected volume kube-api-access-cmkkr for pod openshift-network-diagnostics/network-check-target-qzgrd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:12.172945 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.172911 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr podName:bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea nodeName:}" failed. 
No retries permitted until 2026-04-20 11:42:28.172897317 +0000 UTC m=+34.296316542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cmkkr" (UniqueName: "kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr") pod "network-check-target-qzgrd" (UID: "bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 11:42:12.331673 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:12.331469 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:12.331673 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:12.331516 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:12.331673 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.331603 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:12.331982 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.331730 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:12.676066 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:12.675973 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:12.676234 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.676128 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:12.676234 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:12.676208 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret podName:bc1a0b4c-542c-4194-8902-65ea34abd811 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:20.676185056 +0000 UTC m=+26.799604285 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret") pod "global-pull-secret-syncer-vn8x4" (UID: "bc1a0b4c-542c-4194-8902-65ea34abd811") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:13.331718 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:13.331675 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:13.332226 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:13.331804 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:14.332502 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.331067 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:14.332502 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:14.331402 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:14.334344 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.333192 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:14.334344 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:14.333327 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:14.445937 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.445908 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ltzgg" event={"ID":"8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d","Type":"ContainerStarted","Data":"cc490f8ccb8274643232d8ab3dbd3dd1f546702a0aea9ed5177e1b3d332ea41f"} Apr 20 11:42:14.450664 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.450633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s6chk" event={"ID":"f6dd0225-09bd-4349-9632-48a466010b96","Type":"ContainerStarted","Data":"33336a03596d9b0602d16cc7105de76e5dc3b69cd8a58285450e7ba7dc04514b"} Apr 20 11:42:14.451970 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.451947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-spb5n" event={"ID":"6316a3d4-5236-4574-91c5-ccd6e85aee53","Type":"ContainerStarted","Data":"5c246256e0ccc56febbcb4d2488fe247e9919aa929247f7323052f2d107a24ec"} Apr 20 11:42:14.453197 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.453153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" event={"ID":"f6bd444d-179f-465d-9358-90444a0bd1b0","Type":"ContainerStarted","Data":"14b5d73cf951f2df72ea2b4a4da002a1fad789b788da394282105fe637631c9d"} Apr 20 11:42:14.454456 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.454432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nxf6x" event={"ID":"f7fd4abe-2e90-42e4-b4e1-6d43241cd39a","Type":"ContainerStarted","Data":"e86f49f56b770735e665bca818f0b8cba6538b8c64b533cf6af7093e102f7c63"} Apr 20 11:42:14.455767 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.455745 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" 
event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"0e026bcd1526bf72ac0b61d13506e3071e9589ec499a2d5779229e3a1ed8cf30"} Apr 20 11:42:14.465911 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.465878 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ltzgg" podStartSLOduration=3.324801074 podStartE2EDuration="20.465866207s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.948086855 +0000 UTC m=+3.071506096" lastFinishedPulling="2026-04-20 11:42:14.089151987 +0000 UTC m=+20.212571229" observedRunningTime="2026-04-20 11:42:14.465407129 +0000 UTC m=+20.588826376" watchObservedRunningTime="2026-04-20 11:42:14.465866207 +0000 UTC m=+20.589285453" Apr 20 11:42:14.480764 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.480728 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nxf6x" podStartSLOduration=8.114943639 podStartE2EDuration="20.480714335s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.951435346 +0000 UTC m=+3.074854572" lastFinishedPulling="2026-04-20 11:42:09.317206041 +0000 UTC m=+15.440625268" observedRunningTime="2026-04-20 11:42:14.479965862 +0000 UTC m=+20.603385109" watchObservedRunningTime="2026-04-20 11:42:14.480714335 +0000 UTC m=+20.604133582" Apr 20 11:42:14.499062 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.499026 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s6chk" podStartSLOduration=3.316979575 podStartE2EDuration="20.499012384s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.943492157 +0000 UTC m=+3.066911384" lastFinishedPulling="2026-04-20 11:42:14.125524967 +0000 UTC m=+20.248944193" observedRunningTime="2026-04-20 11:42:14.498849378 +0000 UTC m=+20.622268787" 
watchObservedRunningTime="2026-04-20 11:42:14.499012384 +0000 UTC m=+20.622431630" Apr 20 11:42:14.519857 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.519818 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-spb5n" podStartSLOduration=3.348808194 podStartE2EDuration="20.519804972s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.944386101 +0000 UTC m=+3.067805327" lastFinishedPulling="2026-04-20 11:42:14.115382874 +0000 UTC m=+20.238802105" observedRunningTime="2026-04-20 11:42:14.519715228 +0000 UTC m=+20.643134472" watchObservedRunningTime="2026-04-20 11:42:14.519804972 +0000 UTC m=+20.643224219" Apr 20 11:42:14.591789 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.591767 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fq5mw"] Apr 20 11:42:14.602689 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.602632 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.605208 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.605186 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 11:42:14.605445 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.605422 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 11:42:14.605559 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.605538 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-v6bf2\"" Apr 20 11:42:14.690481 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.690456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7lt\" (UniqueName: \"kubernetes.io/projected/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-kube-api-access-rr7lt\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.690612 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.690494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-tmp-dir\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.690612 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.690543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-hosts-file\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.791918 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:42:14.791538 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-hosts-file\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.791918 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.791590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7lt\" (UniqueName: \"kubernetes.io/projected/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-kube-api-access-rr7lt\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.791918 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.791667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-hosts-file\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.791918 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.791706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-tmp-dir\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.792384 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.791986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-tmp-dir\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.803154 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.803123 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rr7lt\" (UniqueName: \"kubernetes.io/projected/9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552-kube-api-access-rr7lt\") pod \"node-resolver-fq5mw\" (UID: \"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552\") " pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.911721 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:14.911598 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fq5mw" Apr 20 11:42:14.921156 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:14.921123 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c5f9f57_fb9e_4b09_a20c_38f4bd5c5552.slice/crio-dbd4028e970c07df4b5ddd4b4f4b49fb292fee927e81ab01a8477a54087896bd WatchSource:0}: Error finding container dbd4028e970c07df4b5ddd4b4f4b49fb292fee927e81ab01a8477a54087896bd: Status 404 returned error can't find the container with id dbd4028e970c07df4b5ddd4b4f4b49fb292fee927e81ab01a8477a54087896bd Apr 20 11:42:15.330975 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.330943 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:15.331112 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:15.331041 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:15.459300 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.459264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fq5mw" event={"ID":"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552","Type":"ContainerStarted","Data":"3e38417095c2ef55b8f8d6d2483fd1e12a64379e785f569fddd81c2dc7d8c73f"} Apr 20 11:42:15.459896 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.459316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fq5mw" event={"ID":"9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552","Type":"ContainerStarted","Data":"dbd4028e970c07df4b5ddd4b4f4b49fb292fee927e81ab01a8477a54087896bd"} Apr 20 11:42:15.464911 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.464888 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-acl-logging/0.log" Apr 20 11:42:15.465183 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.465164 2578 generic.go:358] "Generic (PLEG): container finished" podID="6933a359-bd42-4dcd-94d7-cc72b948509c" containerID="3671f08d4903a31fa28aacd9648cf0024bf159da97a9e38ad26aa8823b739169" exitCode=1 Apr 20 11:42:15.465276 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.465217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"ae984c5de61d9effe418b88b8259a24973556a161a3e0c68efb46a9a399b786d"} Apr 20 11:42:15.465276 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.465237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"fa22922b033855a16e421a682c04da575e5c78e2a111e2a1c4d5ab6a9ef730a8"} Apr 20 11:42:15.465276 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:42:15.465272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"4c2e48acb93b00e98745bea8abec141892945c0f3f12dc25b43ab7aedb27aa76"} Apr 20 11:42:15.465431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.465282 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"31e0ad474fe3acdff86d4fd0b76d4734bc056bcfa2ec8248fc40fe7962497468"} Apr 20 11:42:15.465431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.465304 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerDied","Data":"3671f08d4903a31fa28aacd9648cf0024bf159da97a9e38ad26aa8823b739169"} Apr 20 11:42:15.466408 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.466385 2578 generic.go:358] "Generic (PLEG): container finished" podID="6276a0c3-e955-4691-b383-18751303b9e2" containerID="4707cc4147b979773edb3dfabfce37a3241e93f04a9df3c360d776a81f7b4de2" exitCode=0 Apr 20 11:42:15.466514 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.466488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerDied","Data":"4707cc4147b979773edb3dfabfce37a3241e93f04a9df3c360d776a81f7b4de2"} Apr 20 11:42:15.475645 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.475608 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fq5mw" podStartSLOduration=1.475593617 podStartE2EDuration="1.475593617s" podCreationTimestamp="2026-04-20 11:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:15.474739904 +0000 UTC m=+21.598159170" watchObservedRunningTime="2026-04-20 11:42:15.475593617 +0000 UTC m=+21.599012864" Apr 20 11:42:15.584600 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:15.584424 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 11:42:16.305390 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.305290 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T11:42:15.58459813Z","UUID":"ba53977d-5ce2-4f33-80c9-2b9edde29222","Handler":null,"Name":"","Endpoint":""} Apr 20 11:42:16.307920 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.307893 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 11:42:16.308060 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.307929 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 11:42:16.332337 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.332307 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:16.332492 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:16.332432 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:16.333349 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.333321 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:16.333473 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:16.333432 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:16.470436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.470395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" event={"ID":"f6bd444d-179f-465d-9358-90444a0bd1b0","Type":"ContainerStarted","Data":"7e7270071e9bf22f88fbc675c693344ebdef7b00c2a052c6e928a8b7771b7710"} Apr 20 11:42:16.471910 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.471875 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s8s8p" event={"ID":"72feb6a6-4564-4f5a-a26a-008d43db43b7","Type":"ContainerStarted","Data":"07470c41eb7e86d33de6bc3e2f9d6ae217dcde6f47b2873fd4f84011cc79d623"} Apr 20 11:42:16.488491 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.488445 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s8s8p" podStartSLOduration=5.318728673 podStartE2EDuration="22.488426754s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.943397107 +0000 UTC m=+3.066816332" lastFinishedPulling="2026-04-20 11:42:14.113095181 +0000 UTC m=+20.236514413" 
observedRunningTime="2026-04-20 11:42:16.487940258 +0000 UTC m=+22.611359504" watchObservedRunningTime="2026-04-20 11:42:16.488426754 +0000 UTC m=+22.611846002" Apr 20 11:42:16.944468 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.944444 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:42:16.945019 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:16.944996 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:42:17.331486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.331450 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:17.331671 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:17.331564 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:17.475790 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.475750 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" event={"ID":"f6bd444d-179f-465d-9358-90444a0bd1b0","Type":"ContainerStarted","Data":"1889e60336bc2facf67457e97c0b5a83fd405f9fc53d00dc73f49234c9bf6cec"} Apr 20 11:42:17.478807 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.478785 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-acl-logging/0.log" Apr 20 11:42:17.479304 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.479266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"877a8befc1cf34b3aaf6c291b83ac84024d470d53513ea1f00920c644c755d43"} Apr 20 11:42:17.479562 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.479535 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:42:17.480144 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.480128 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nxf6x" Apr 20 11:42:17.508598 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:17.508552 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zzv4z" podStartSLOduration=3.720409497 podStartE2EDuration="23.508536039s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.947050467 +0000 UTC m=+3.070469693" lastFinishedPulling="2026-04-20 11:42:16.735176995 +0000 UTC m=+22.858596235" observedRunningTime="2026-04-20 11:42:17.493862042 +0000 UTC 
m=+23.617281290" watchObservedRunningTime="2026-04-20 11:42:17.508536039 +0000 UTC m=+23.631955283" Apr 20 11:42:18.331340 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:18.331300 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:18.331340 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:18.331334 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:18.331583 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:18.331429 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:18.331583 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:18.331529 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:19.330850 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:19.330771 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:19.331347 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:19.330886 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:20.331430 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.331387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:20.331864 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.331436 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:20.331864 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:20.331531 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:20.331864 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:20.331719 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:20.487363 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.487334 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-acl-logging/0.log" Apr 20 11:42:20.487655 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.487627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"d654d0b13b646a547431a5976d564b6ba480861185c8051e0f94bcadf153581e"} Apr 20 11:42:20.487899 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.487875 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:42:20.488117 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.488100 2578 scope.go:117] "RemoveContainer" containerID="3671f08d4903a31fa28aacd9648cf0024bf159da97a9e38ad26aa8823b739169" Apr 20 11:42:20.490272 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.490227 2578 generic.go:358] "Generic (PLEG): container finished" podID="6276a0c3-e955-4691-b383-18751303b9e2" containerID="988e561507c7d9eb3bea5c303bd15456964adaac7d87c3db5aa63010081a8926" exitCode=0 Apr 20 11:42:20.490355 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.490306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerDied","Data":"988e561507c7d9eb3bea5c303bd15456964adaac7d87c3db5aa63010081a8926"} Apr 20 11:42:20.503664 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:20.503633 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:42:20.737120 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:42:20.737017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:20.737304 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:20.737219 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:20.737362 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:20.737315 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret podName:bc1a0b4c-542c-4194-8902-65ea34abd811 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:36.737293854 +0000 UTC m=+42.860713084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret") pod "global-pull-secret-syncer-vn8x4" (UID: "bc1a0b4c-542c-4194-8902-65ea34abd811") : object "kube-system"/"original-pull-secret" not registered Apr 20 11:42:21.331589 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.331554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:21.332110 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:21.331675 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:21.405943 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.405866 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vn8x4"] Apr 20 11:42:21.409297 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.409269 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qzgrd"] Apr 20 11:42:21.409439 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.409390 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:21.409560 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:21.409483 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:21.409923 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.409901 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4lcnh"] Apr 20 11:42:21.410041 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.410021 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:21.410163 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:21.410140 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:21.495309 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.495281 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-acl-logging/0.log" Apr 20 11:42:21.495664 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.495644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" event={"ID":"6933a359-bd42-4dcd-94d7-cc72b948509c","Type":"ContainerStarted","Data":"8f4e369645eff77bed9688b05db4c134385b9ee21d8562b2fde1b341f1fac0d4"} Apr 20 11:42:21.495771 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.495757 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 11:42:21.496041 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.496011 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:42:21.497675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.497652 2578 generic.go:358] "Generic (PLEG): container finished" podID="6276a0c3-e955-4691-b383-18751303b9e2" containerID="c74bb8dd7455f88b26c9acbff85ba73f5c9071fa8b4ac449505bb99aa29810a4" exitCode=0 Apr 20 11:42:21.497779 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.497691 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerDied","Data":"c74bb8dd7455f88b26c9acbff85ba73f5c9071fa8b4ac449505bb99aa29810a4"} Apr 20 11:42:21.497779 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.497728 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:21.497925 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:21.497908 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:21.511388 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.511360 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:42:21.523189 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:21.523147 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" podStartSLOduration=10.315625117 podStartE2EDuration="27.523134544s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.950719901 +0000 UTC m=+3.074139128" lastFinishedPulling="2026-04-20 11:42:14.158229323 +0000 UTC m=+20.281648555" observedRunningTime="2026-04-20 11:42:21.52154661 +0000 UTC m=+27.644965856" watchObservedRunningTime="2026-04-20 11:42:21.523134544 +0000 UTC m=+27.646553843" Apr 20 11:42:22.501686 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:22.501591 2578 generic.go:358] "Generic (PLEG): container finished" podID="6276a0c3-e955-4691-b383-18751303b9e2" containerID="45ea16f8ca358d292a13123f7985ea1cff2018a9faf65df68c0884adc6d5783e" exitCode=0 Apr 20 11:42:22.502215 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:22.501678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" 
event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerDied","Data":"45ea16f8ca358d292a13123f7985ea1cff2018a9faf65df68c0884adc6d5783e"} Apr 20 11:42:22.502215 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:22.501852 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 11:42:23.331196 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:23.331162 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:23.331196 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:23.331190 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:23.331431 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:23.331226 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:23.331431 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:23.331369 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:23.331535 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:23.331429 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:23.331581 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:23.331541 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:23.509886 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:23.508898 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 11:42:25.331473 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:25.331426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:25.331473 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:25.331455 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:25.331917 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:25.331426 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:25.331917 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:25.331551 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qzgrd" podUID="bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea" Apr 20 11:42:25.331917 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:25.331621 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4lcnh" podUID="d9165296-57f0-4590-ad83-189871356a1a" Apr 20 11:42:25.331917 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:25.331717 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vn8x4" podUID="bc1a0b4c-542c-4194-8902-65ea34abd811" Apr 20 11:42:25.442333 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:25.442295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:42:25.442617 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:25.442601 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 11:42:25.455125 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:25.455090 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8q26d" Apr 20 11:42:26.201408 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.201380 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-125.ec2.internal" event="NodeReady" Apr 20 11:42:26.201591 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.201544 2578 kubelet_node_status.go:550] "Fast updating node status as it just 
became ready" Apr 20 11:42:26.242105 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.242034 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b66586b66-hghv5"] Apr 20 11:42:26.263661 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.263628 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lvlmv"] Apr 20 11:42:26.263843 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.263820 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.268048 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.267838 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 11:42:26.268181 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.268167 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jqbrx\"" Apr 20 11:42:26.268457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.268425 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 11:42:26.271327 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.271254 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 11:42:26.280334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.280306 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 11:42:26.281940 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.281915 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8sqkc"] Apr 20 11:42:26.282116 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.282096 2578 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.284654 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.284635 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 11:42:26.284768 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.284747 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 11:42:26.284877 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.284700 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5crpn\"" Apr 20 11:42:26.304491 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.304462 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b66586b66-hghv5"] Apr 20 11:42:26.304491 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.304491 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8sqkc"] Apr 20 11:42:26.304491 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.304500 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lvlmv"] Apr 20 11:42:26.304736 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.304627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.308297 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.308265 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 11:42:26.308410 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.308324 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 11:42:26.308560 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.308542 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-twcdr\"" Apr 20 11:42:26.308642 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.308598 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 11:42:26.381492 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.381492 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-bound-sa-token\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p2tm\" 
(UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-kube-api-access-2p2tm\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53d403af-81c7-4e78-8fe5-d31a6f123b4b-tmp-dir\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53d403af-81c7-4e78-8fe5-d31a6f123b4b-config-volume\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381672 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8796d702-f320-49e1-9033-817a78763256-ca-trust-extracted\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:42:26.381745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-installation-pull-secrets\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-image-registry-private-configuration\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381824 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-trusted-ca\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfbsb\" (UniqueName: \"kubernetes.io/projected/53d403af-81c7-4e78-8fe5-d31a6f123b4b-kube-api-access-bfbsb\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381943 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-registry-certificates\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.381976 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.382410 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.381998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn7jz\" (UniqueName: \"kubernetes.io/projected/e3fe8f50-9343-4a7e-8938-2d2334926942-kube-api-access-rn7jz\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.482793 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-bound-sa-token\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.482793 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p2tm\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-kube-api-access-2p2tm\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483032 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:42:26.482832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53d403af-81c7-4e78-8fe5-d31a6f123b4b-tmp-dir\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.483032 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53d403af-81c7-4e78-8fe5-d31a6f123b4b-config-volume\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.483032 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483032 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8796d702-f320-49e1-9033-817a78763256-ca-trust-extracted\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483032 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-installation-pull-secrets\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " 
pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483032 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-image-registry-private-configuration\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483032 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.482982 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-trusted-ca\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483372 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.483037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfbsb\" (UniqueName: \"kubernetes.io/projected/53d403af-81c7-4e78-8fe5-d31a6f123b4b-kube-api-access-bfbsb\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.483372 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.483100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-registry-certificates\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.483372 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.483131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.483372 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.483156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn7jz\" (UniqueName: \"kubernetes.io/projected/e3fe8f50-9343-4a7e-8938-2d2334926942-kube-api-access-rn7jz\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.483372 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.483185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.483372 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.483349 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:26.483563 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.483417 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls podName:53d403af-81c7-4e78-8fe5-d31a6f123b4b nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.983395553 +0000 UTC m=+33.106814785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls") pod "dns-default-lvlmv" (UID: "53d403af-81c7-4e78-8fe5-d31a6f123b4b") : secret "dns-default-metrics-tls" not found Apr 20 11:42:26.483876 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.483848 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:26.484007 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.483912 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert podName:e3fe8f50-9343-4a7e-8938-2d2334926942 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.983892882 +0000 UTC m=+33.107312110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert") pod "ingress-canary-8sqkc" (UID: "e3fe8f50-9343-4a7e-8938-2d2334926942") : secret "canary-serving-cert" not found Apr 20 11:42:26.484233 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.484211 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/53d403af-81c7-4e78-8fe5-d31a6f123b4b-tmp-dir\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.484515 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.484487 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:26.484515 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.484512 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b66586b66-hghv5: secret "image-registry-tls" not found Apr 20 11:42:26.484661 ip-10-0-133-125 kubenswrapper[2578]: E0420 
11:42:26.484586 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls podName:8796d702-f320-49e1-9033-817a78763256 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:26.984569514 +0000 UTC m=+33.107988756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls") pod "image-registry-b66586b66-hghv5" (UID: "8796d702-f320-49e1-9033-817a78763256") : secret "image-registry-tls" not found Apr 20 11:42:26.484705 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.484672 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53d403af-81c7-4e78-8fe5-d31a6f123b4b-config-volume\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.494136 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.494050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-registry-certificates\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.495382 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.495356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-trusted-ca\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.495572 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.495526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8796d702-f320-49e1-9033-817a78763256-ca-trust-extracted\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.498285 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.498257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-bound-sa-token\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.498409 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.498269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfbsb\" (UniqueName: \"kubernetes.io/projected/53d403af-81c7-4e78-8fe5-d31a6f123b4b-kube-api-access-bfbsb\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.498409 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.498382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn7jz\" (UniqueName: \"kubernetes.io/projected/e3fe8f50-9343-4a7e-8938-2d2334926942-kube-api-access-rn7jz\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.498730 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.498713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p2tm\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-kube-api-access-2p2tm\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.498816 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:42:26.498795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-image-registry-private-configuration\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.499379 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.499359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-installation-pull-secrets\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.987211 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.987166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:26.987439 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.987297 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:26.987439 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:26.987321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " 
pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:26.987439 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987354 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:26.987439 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987375 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b66586b66-hghv5: secret "image-registry-tls" not found Apr 20 11:42:26.987439 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987414 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:26.987680 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987458 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls podName:8796d702-f320-49e1-9033-817a78763256 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.987434282 +0000 UTC m=+34.110853508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls") pod "image-registry-b66586b66-hghv5" (UID: "8796d702-f320-49e1-9033-817a78763256") : secret "image-registry-tls" not found Apr 20 11:42:26.987680 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987479 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls podName:53d403af-81c7-4e78-8fe5-d31a6f123b4b nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.987468474 +0000 UTC m=+34.110887701 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls") pod "dns-default-lvlmv" (UID: "53d403af-81c7-4e78-8fe5-d31a6f123b4b") : secret "dns-default-metrics-tls" not found Apr 20 11:42:26.987680 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987498 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:26.987680 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:26.987533 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert podName:e3fe8f50-9343-4a7e-8938-2d2334926942 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:27.987516743 +0000 UTC m=+34.110935968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert") pod "ingress-canary-8sqkc" (UID: "e3fe8f50-9343-4a7e-8938-2d2334926942") : secret "canary-serving-cert" not found Apr 20 11:42:27.331402 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.331363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:27.331402 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.331390 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:27.331822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.331369 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:27.335773 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.335753 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-dmnrq\"" Apr 20 11:42:27.335928 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.335777 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 11:42:27.335928 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.335803 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 11:42:27.335928 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.335839 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 11:42:27.336091 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.336042 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 11:42:27.336091 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.336081 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lppl4\"" Apr 20 11:42:27.997285 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.997204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:27.997285 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.997279 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:27.997342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997387 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997462 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert podName:e3fe8f50-9343-4a7e-8938-2d2334926942 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.997441564 +0000 UTC m=+36.120860794 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert") pod "ingress-canary-8sqkc" (UID: "e3fe8f50-9343-4a7e-8938-2d2334926942") : secret "canary-serving-cert" not found Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997466 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997483 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b66586b66-hghv5: secret "image-registry-tls" not found Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997505 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997518 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls podName:8796d702-f320-49e1-9033-817a78763256 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.997507091 +0000 UTC m=+36.120926318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls") pod "image-registry-b66586b66-hghv5" (UID: "8796d702-f320-49e1-9033-817a78763256") : secret "image-registry-tls" not found Apr 20 11:42:27.997681 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:27.997542 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls podName:53d403af-81c7-4e78-8fe5-d31a6f123b4b nodeName:}" failed. No retries permitted until 2026-04-20 11:42:29.997531094 +0000 UTC m=+36.120950318 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls") pod "dns-default-lvlmv" (UID: "53d403af-81c7-4e78-8fe5-d31a6f123b4b") : secret "dns-default-metrics-tls" not found Apr 20 11:42:28.098680 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.098649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh" Apr 20 11:42:28.098829 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:28.098792 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 11:42:28.098899 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:28.098880 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs podName:d9165296-57f0-4590-ad83-189871356a1a nodeName:}" failed. No retries permitted until 2026-04-20 11:43:00.098856713 +0000 UTC m=+66.222275944 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs") pod "network-metrics-daemon-4lcnh" (UID: "d9165296-57f0-4590-ad83-189871356a1a") : secret "metrics-daemon-secret" not found Apr 20 11:42:28.199696 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.199669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:28.202425 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.202401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmkkr\" (UniqueName: \"kubernetes.io/projected/bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea-kube-api-access-cmkkr\") pod \"network-check-target-qzgrd\" (UID: \"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea\") " pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:28.254562 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.254527 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:28.424190 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.424114 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qzgrd"] Apr 20 11:42:28.427188 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:28.427139 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc2ca7e3_e1d9_4855_90b5_3eb77ac6efea.slice/crio-cece8643a28afe86ba726ec10631d9c1b7f86a3c3815e887dd4b4ee60665ab93 WatchSource:0}: Error finding container cece8643a28afe86ba726ec10631d9c1b7f86a3c3815e887dd4b4ee60665ab93: Status 404 returned error can't find the container with id cece8643a28afe86ba726ec10631d9c1b7f86a3c3815e887dd4b4ee60665ab93 Apr 20 11:42:28.520822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.520785 2578 generic.go:358] "Generic (PLEG): container finished" podID="6276a0c3-e955-4691-b383-18751303b9e2" containerID="635e8d94d4f26090fd8f1237873bcbb77159c8ca76d53dbc1d77ec7d2996f0bb" exitCode=0 Apr 20 11:42:28.520994 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.520856 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerDied","Data":"635e8d94d4f26090fd8f1237873bcbb77159c8ca76d53dbc1d77ec7d2996f0bb"} Apr 20 11:42:28.521948 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:28.521918 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qzgrd" event={"ID":"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea","Type":"ContainerStarted","Data":"cece8643a28afe86ba726ec10631d9c1b7f86a3c3815e887dd4b4ee60665ab93"} Apr 20 11:42:29.526974 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:29.526791 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="6276a0c3-e955-4691-b383-18751303b9e2" containerID="81a680a785400ce077726e8a8bd6db79e4246d3cd2b5439d9feff2288f0e9ebf" exitCode=0 Apr 20 11:42:29.527685 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:29.526874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerDied","Data":"81a680a785400ce077726e8a8bd6db79e4246d3cd2b5439d9feff2288f0e9ebf"} Apr 20 11:42:30.014725 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:30.014695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:30.014907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:30.014743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:30.014907 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:30.014815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:30.014907 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.014871 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:30.015043 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.014910 2578 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:30.015043 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.014957 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert podName:e3fe8f50-9343-4a7e-8938-2d2334926942 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:34.01493406 +0000 UTC m=+40.138353302 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert") pod "ingress-canary-8sqkc" (UID: "e3fe8f50-9343-4a7e-8938-2d2334926942") : secret "canary-serving-cert" not found Apr 20 11:42:30.015043 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.014975 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:30.015043 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.014987 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b66586b66-hghv5: secret "image-registry-tls" not found Apr 20 11:42:30.015043 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.014976 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls podName:53d403af-81c7-4e78-8fe5-d31a6f123b4b nodeName:}" failed. No retries permitted until 2026-04-20 11:42:34.014965954 +0000 UTC m=+40.138385178 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls") pod "dns-default-lvlmv" (UID: "53d403af-81c7-4e78-8fe5-d31a6f123b4b") : secret "dns-default-metrics-tls" not found Apr 20 11:42:30.015043 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:30.015043 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls podName:8796d702-f320-49e1-9033-817a78763256 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:34.015030828 +0000 UTC m=+40.138450053 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls") pod "image-registry-b66586b66-hghv5" (UID: "8796d702-f320-49e1-9033-817a78763256") : secret "image-registry-tls" not found Apr 20 11:42:30.532774 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:30.532721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" event={"ID":"6276a0c3-e955-4691-b383-18751303b9e2","Type":"ContainerStarted","Data":"e833142fb443d8346984ad0429690d8636afced8f3ecef24f8b497d678b989d2"} Apr 20 11:42:30.571352 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:30.571295 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vg6zv" podStartSLOduration=5.55979961 podStartE2EDuration="36.571277161s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:41:56.949451396 +0000 UTC m=+3.072870622" lastFinishedPulling="2026-04-20 11:42:27.960928945 +0000 UTC m=+34.084348173" observedRunningTime="2026-04-20 11:42:30.570746377 +0000 UTC m=+36.694165640" watchObservedRunningTime="2026-04-20 11:42:30.571277161 +0000 UTC m=+36.694696411" Apr 20 11:42:32.539632 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:42:32.539595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qzgrd" event={"ID":"bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea","Type":"ContainerStarted","Data":"f2dedc8188bd3512261c84443501c8f752e34b02175a02032016cbb4bbc0097d"} Apr 20 11:42:32.540057 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:32.539766 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qzgrd" Apr 20 11:42:34.046906 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:34.046845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:34.046906 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:34.046909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:34.046948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047005 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047067 2578 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047077 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert podName:e3fe8f50-9343-4a7e-8938-2d2334926942 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:42.047059852 +0000 UTC m=+48.170479081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert") pod "ingress-canary-8sqkc" (UID: "e3fe8f50-9343-4a7e-8938-2d2334926942") : secret "canary-serving-cert" not found Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047078 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047083 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b66586b66-hghv5: secret "image-registry-tls" not found Apr 20 11:42:34.047456 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047403 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls podName:53d403af-81c7-4e78-8fe5-d31a6f123b4b nodeName:}" failed. No retries permitted until 2026-04-20 11:42:42.047115799 +0000 UTC m=+48.170535024 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls") pod "dns-default-lvlmv" (UID: "53d403af-81c7-4e78-8fe5-d31a6f123b4b") : secret "dns-default-metrics-tls" not found Apr 20 11:42:34.047771 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:34.047597 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls podName:8796d702-f320-49e1-9033-817a78763256 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:42.047575705 +0000 UTC m=+48.170994933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls") pod "image-registry-b66586b66-hghv5" (UID: "8796d702-f320-49e1-9033-817a78763256") : secret "image-registry-tls" not found Apr 20 11:42:36.764371 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:36.764318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:36.767911 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:36.767880 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bc1a0b4c-542c-4194-8902-65ea34abd811-original-pull-secret\") pod \"global-pull-secret-syncer-vn8x4\" (UID: \"bc1a0b4c-542c-4194-8902-65ea34abd811\") " pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:36.943788 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:36.943740 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vn8x4" Apr 20 11:42:37.061597 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.061526 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qzgrd" podStartSLOduration=40.046242323 podStartE2EDuration="43.061506708s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:42:28.429060105 +0000 UTC m=+34.552479330" lastFinishedPulling="2026-04-20 11:42:31.444324486 +0000 UTC m=+37.567743715" observedRunningTime="2026-04-20 11:42:32.563969416 +0000 UTC m=+38.687388662" watchObservedRunningTime="2026-04-20 11:42:37.061506708 +0000 UTC m=+43.184925952" Apr 20 11:42:37.062430 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.062409 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vn8x4"] Apr 20 11:42:37.065453 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:37.065431 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1a0b4c_542c_4194_8902_65ea34abd811.slice/crio-9cdd1b41bd5882a74aff03673b2c6af5590886d112b1a8c7f5c7ce0fc204bc9e WatchSource:0}: Error finding container 9cdd1b41bd5882a74aff03673b2c6af5590886d112b1a8c7f5c7ce0fc204bc9e: Status 404 returned error can't find the container with id 9cdd1b41bd5882a74aff03673b2c6af5590886d112b1a8c7f5c7ce0fc204bc9e Apr 20 11:42:37.551113 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.551077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vn8x4" event={"ID":"bc1a0b4c-542c-4194-8902-65ea34abd811","Type":"ContainerStarted","Data":"9cdd1b41bd5882a74aff03673b2c6af5590886d112b1a8c7f5c7ce0fc204bc9e"} Apr 20 11:42:37.677449 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.677354 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96"] Apr 20 11:42:37.680856 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.680829 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.685016 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.684955 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 11:42:37.685148 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.685026 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 11:42:37.686419 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.686401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 11:42:37.686553 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.686452 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 11:42:37.691941 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.691917 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96"] Apr 20 11:42:37.773453 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.773419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f62d728-f592-473e-8116-2b4ae5d6f3dc-tmp\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.773873 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:42:37.773466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2f62d728-f592-473e-8116-2b4ae5d6f3dc-klusterlet-config\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.773873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.773529 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4b7j\" (UniqueName: \"kubernetes.io/projected/2f62d728-f592-473e-8116-2b4ae5d6f3dc-kube-api-access-d4b7j\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.874440 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.874357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f62d728-f592-473e-8116-2b4ae5d6f3dc-tmp\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.874440 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.874404 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2f62d728-f592-473e-8116-2b4ae5d6f3dc-klusterlet-config\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.874664 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.874478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d4b7j\" (UniqueName: \"kubernetes.io/projected/2f62d728-f592-473e-8116-2b4ae5d6f3dc-kube-api-access-d4b7j\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.874861 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.874807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2f62d728-f592-473e-8116-2b4ae5d6f3dc-tmp\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.877432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.877407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2f62d728-f592-473e-8116-2b4ae5d6f3dc-klusterlet-config\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.884321 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.884295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4b7j\" (UniqueName: \"kubernetes.io/projected/2f62d728-f592-473e-8116-2b4ae5d6f3dc-kube-api-access-d4b7j\") pod \"klusterlet-addon-workmgr-9cf9fbd89-mdx96\" (UID: \"2f62d728-f592-473e-8116-2b4ae5d6f3dc\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:37.992738 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:37.992699 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:38.130252 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:38.130149 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96"] Apr 20 11:42:38.134042 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:38.134000 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f62d728_f592_473e_8116_2b4ae5d6f3dc.slice/crio-fecc95428b7d66120d484ec9c330dbaab28d20b5e7dc3843b6b0447c4fe45b3a WatchSource:0}: Error finding container fecc95428b7d66120d484ec9c330dbaab28d20b5e7dc3843b6b0447c4fe45b3a: Status 404 returned error can't find the container with id fecc95428b7d66120d484ec9c330dbaab28d20b5e7dc3843b6b0447c4fe45b3a Apr 20 11:42:38.555106 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:38.555066 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" event={"ID":"2f62d728-f592-473e-8116-2b4ae5d6f3dc","Type":"ContainerStarted","Data":"fecc95428b7d66120d484ec9c330dbaab28d20b5e7dc3843b6b0447c4fe45b3a"} Apr 20 11:42:42.107894 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:42.107857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:42.107894 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:42.107900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " 
pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108002 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:42.108035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108001 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108101 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108111 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b66586b66-hghv5: secret "image-registry-tls" not found Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108064 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls podName:53d403af-81c7-4e78-8fe5-d31a6f123b4b nodeName:}" failed. No retries permitted until 2026-04-20 11:42:58.108049555 +0000 UTC m=+64.231468779 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls") pod "dns-default-lvlmv" (UID: "53d403af-81c7-4e78-8fe5-d31a6f123b4b") : secret "dns-default-metrics-tls" not found Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108155 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert podName:e3fe8f50-9343-4a7e-8938-2d2334926942 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:58.108134385 +0000 UTC m=+64.231553621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert") pod "ingress-canary-8sqkc" (UID: "e3fe8f50-9343-4a7e-8938-2d2334926942") : secret "canary-serving-cert" not found Apr 20 11:42:42.108408 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:42.108182 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls podName:8796d702-f320-49e1-9033-817a78763256 nodeName:}" failed. No retries permitted until 2026-04-20 11:42:58.10817145 +0000 UTC m=+64.231590680 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls") pod "image-registry-b66586b66-hghv5" (UID: "8796d702-f320-49e1-9033-817a78763256") : secret "image-registry-tls" not found Apr 20 11:42:42.565403 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:42.565361 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vn8x4" event={"ID":"bc1a0b4c-542c-4194-8902-65ea34abd811","Type":"ContainerStarted","Data":"987ea1189babfb5755ce318055280188c6ddb2ce026717c62fcb4b265924a201"} Apr 20 11:42:44.363390 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:44.363333 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vn8x4" podStartSLOduration=35.766457114 podStartE2EDuration="40.363316038s" podCreationTimestamp="2026-04-20 11:42:04 +0000 UTC" firstStartedPulling="2026-04-20 11:42:37.067586138 +0000 UTC m=+43.191005366" lastFinishedPulling="2026-04-20 11:42:41.664445053 +0000 UTC m=+47.787864290" observedRunningTime="2026-04-20 11:42:42.59232308 +0000 UTC m=+48.715742327" watchObservedRunningTime="2026-04-20 11:42:44.363316038 +0000 UTC m=+50.486735285" Apr 20 11:42:48.579195 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:48.579153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" event={"ID":"2f62d728-f592-473e-8116-2b4ae5d6f3dc","Type":"ContainerStarted","Data":"df74ea839154a9098c9ec457158d9263724c5aade0a5b6e1fb9d659e9fc54141"} Apr 20 11:42:48.599914 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:48.599864 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" podStartSLOduration=1.296204755 podStartE2EDuration="11.599849742s" podCreationTimestamp="2026-04-20 11:42:37 +0000 UTC" 
firstStartedPulling="2026-04-20 11:42:38.1360919 +0000 UTC m=+44.259511128" lastFinishedPulling="2026-04-20 11:42:48.439736891 +0000 UTC m=+54.563156115" observedRunningTime="2026-04-20 11:42:48.598994643 +0000 UTC m=+54.722413891" watchObservedRunningTime="2026-04-20 11:42:48.599849742 +0000 UTC m=+54.723268993" Apr 20 11:42:49.581545 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:49.581510 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:49.583047 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:49.583026 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-9cf9fbd89-mdx96" Apr 20 11:42:57.799222 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.799180 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg"] Apr 20 11:42:57.803766 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.803744 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" Apr 20 11:42:57.806471 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.806450 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:42:57.806587 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.806569 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 11:42:57.807873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.807856 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-spdkm\"" Apr 20 11:42:57.814967 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.814942 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg"] Apr 20 11:42:57.829274 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.829227 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtmp5\" (UniqueName: \"kubernetes.io/projected/904766d7-451c-4378-8dbf-8486dbbd70e6-kube-api-access-dtmp5\") pod \"volume-data-source-validator-7c6cbb6c87-7qrlg\" (UID: \"904766d7-451c-4378-8dbf-8486dbbd70e6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" Apr 20 11:42:57.905441 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.905404 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp"] Apr 20 11:42:57.908658 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.908640 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:57.911822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.911797 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 11:42:57.911995 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.911844 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 11:42:57.912145 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.912010 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:42:57.912268 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.912228 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-9h29z\"" Apr 20 11:42:57.912845 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.912827 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 11:42:57.914018 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.913994 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4mhc4"] Apr 20 11:42:57.917283 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.917265 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-69794ff49d-n7xjz"] Apr 20 11:42:57.917464 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.917446 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:57.920110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.920095 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vqnvr\"" Apr 20 11:42:57.920348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.920331 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:57.921031 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.921008 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 11:42:57.921983 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.921961 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp"] Apr 20 11:42:57.923192 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.923170 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9dn64\"" Apr 20 11:42:57.923307 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.923170 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 11:42:57.923307 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.923174 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 11:42:57.923566 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.923540 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 11:42:57.923678 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.923642 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 11:42:57.923745 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.923681 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 11:42:57.924731 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.924712 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 11:42:57.924944 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.924929 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 11:42:57.925064 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.925044 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 11:42:57.929492 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.929473 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 11:42:57.929761 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.929690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtmp5\" (UniqueName: \"kubernetes.io/projected/904766d7-451c-4378-8dbf-8486dbbd70e6-kube-api-access-dtmp5\") pod \"volume-data-source-validator-7c6cbb6c87-7qrlg\" (UID: \"904766d7-451c-4378-8dbf-8486dbbd70e6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" Apr 20 11:42:57.933059 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.933038 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 11:42:57.939152 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.939126 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-b66586b66-hghv5"] Apr 20 11:42:57.939346 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:42:57.939327 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-b66586b66-hghv5" podUID="8796d702-f320-49e1-9033-817a78763256" Apr 20 11:42:57.939802 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.939786 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4mhc4"] Apr 20 11:42:57.961518 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.961484 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69794ff49d-n7xjz"] Apr 20 11:42:57.964623 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:57.964592 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtmp5\" (UniqueName: \"kubernetes.io/projected/904766d7-451c-4378-8dbf-8486dbbd70e6-kube-api-access-dtmp5\") pod \"volume-data-source-validator-7c6cbb6c87-7qrlg\" (UID: \"904766d7-451c-4378-8dbf-8486dbbd70e6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" Apr 20 11:42:58.030408 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-metrics-certs\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.030408 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-serving-cert\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.030608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b624b2-8718-4b1f-9f76-0459cb6d4184-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.030608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlqxg\" (UniqueName: \"kubernetes.io/projected/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-kube-api-access-rlqxg\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.030608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030475 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192a48bd-bcb6-4fec-9fa3-cd24f83284be-service-ca-bundle\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.030608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24zt\" (UniqueName: \"kubernetes.io/projected/f7b624b2-8718-4b1f-9f76-0459cb6d4184-kube-api-access-t24zt\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.030608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b624b2-8718-4b1f-9f76-0459cb6d4184-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.030782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-default-certificate\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.030782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-stats-auth\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.030782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030674 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-config\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.030782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030693 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6x2\" (UniqueName: \"kubernetes.io/projected/192a48bd-bcb6-4fec-9fa3-cd24f83284be-kube-api-access-qt6x2\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.030782 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.030720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-trusted-ca\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.113137 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.113044 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" Apr 20 11:42:58.131535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-metrics-certs\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.131666 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131555 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-serving-cert\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.131666 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv" Apr 20 11:42:58.131666 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b624b2-8718-4b1f-9f76-0459cb6d4184-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.131666 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rlqxg\" (UniqueName: \"kubernetes.io/projected/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-kube-api-access-rlqxg\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192a48bd-bcb6-4fec-9fa3-cd24f83284be-service-ca-bundle\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t24zt\" (UniqueName: \"kubernetes.io/projected/f7b624b2-8718-4b1f-9f76-0459cb6d4184-kube-api-access-t24zt\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131773 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b624b2-8718-4b1f-9f76-0459cb6d4184-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: 
\"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-default-certificate\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-stats-auth\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.131876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc" Apr 20 11:42:58.132253 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-config\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.132253 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6x2\" (UniqueName: 
\"kubernetes.io/projected/192a48bd-bcb6-4fec-9fa3-cd24f83284be-kube-api-access-qt6x2\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.132253 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.131960 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-trusted-ca\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.132934 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.132656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/192a48bd-bcb6-4fec-9fa3-cd24f83284be-service-ca-bundle\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz" Apr 20 11:42:58.133683 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.133204 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-trusted-ca\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" Apr 20 11:42:58.133683 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.133628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7b624b2-8718-4b1f-9f76-0459cb6d4184-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" Apr 20 11:42:58.134002 
ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.133864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-config\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4"
Apr 20 11:42:58.134930 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.134903 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-serving-cert\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4"
Apr 20 11:42:58.135081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.134979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7b624b2-8718-4b1f-9f76-0459cb6d4184-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp"
Apr 20 11:42:58.135462 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.135434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53d403af-81c7-4e78-8fe5-d31a6f123b4b-metrics-tls\") pod \"dns-default-lvlmv\" (UID: \"53d403af-81c7-4e78-8fe5-d31a6f123b4b\") " pod="openshift-dns/dns-default-lvlmv"
Apr 20 11:42:58.135960 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.135919 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fe8f50-9343-4a7e-8938-2d2334926942-cert\") pod \"ingress-canary-8sqkc\" (UID: \"e3fe8f50-9343-4a7e-8938-2d2334926942\") " pod="openshift-ingress-canary/ingress-canary-8sqkc"
Apr 20 11:42:58.136633 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.136290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-default-certificate\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:42:58.137258 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.137209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"image-registry-b66586b66-hghv5\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") " pod="openshift-image-registry/image-registry-b66586b66-hghv5"
Apr 20 11:42:58.137445 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.137430 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-stats-auth\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:42:58.137627 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.137609 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/192a48bd-bcb6-4fec-9fa3-cd24f83284be-metrics-certs\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:42:58.154226 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.154203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24zt\" (UniqueName: \"kubernetes.io/projected/f7b624b2-8718-4b1f-9f76-0459cb6d4184-kube-api-access-t24zt\") pod \"kube-storage-version-migrator-operator-6769c5d45-khfpp\" (UID: \"f7b624b2-8718-4b1f-9f76-0459cb6d4184\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp"
Apr 20 11:42:58.155094 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.155060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6x2\" (UniqueName: \"kubernetes.io/projected/192a48bd-bcb6-4fec-9fa3-cd24f83284be-kube-api-access-qt6x2\") pod \"router-default-69794ff49d-n7xjz\" (UID: \"192a48bd-bcb6-4fec-9fa3-cd24f83284be\") " pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:42:58.158421 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.158397 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlqxg\" (UniqueName: \"kubernetes.io/projected/7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd-kube-api-access-rlqxg\") pod \"console-operator-9d4b6777b-4mhc4\" (UID: \"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd\") " pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4"
Apr 20 11:42:58.218879 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.218849 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp"
Apr 20 11:42:58.227688 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.227653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4"
Apr 20 11:42:58.233975 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.233945 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:42:58.234316 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.234293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg"]
Apr 20 11:42:58.239342 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:58.239307 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod904766d7_451c_4378_8dbf_8486dbbd70e6.slice/crio-4ed46f3364575072811612cc77af516501f99ad1c9c46624771d0bfc639ec03d WatchSource:0}: Error finding container 4ed46f3364575072811612cc77af516501f99ad1c9c46624771d0bfc639ec03d: Status 404 returned error can't find the container with id 4ed46f3364575072811612cc77af516501f99ad1c9c46624771d0bfc639ec03d
Apr 20 11:42:58.379570 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.379495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp"]
Apr 20 11:42:58.382962 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:58.382931 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7b624b2_8718_4b1f_9f76_0459cb6d4184.slice/crio-7b02a37d0098d29dbf87e6c47852e81f2d5ea8794657fc77f027618dff66a6b9 WatchSource:0}: Error finding container 7b02a37d0098d29dbf87e6c47852e81f2d5ea8794657fc77f027618dff66a6b9: Status 404 returned error can't find the container with id 7b02a37d0098d29dbf87e6c47852e81f2d5ea8794657fc77f027618dff66a6b9
Apr 20 11:42:58.394543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.394524 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5crpn\""
Apr 20 11:42:58.402792 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.402773 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lvlmv"
Apr 20 11:42:58.416991 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.416967 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-twcdr\""
Apr 20 11:42:58.425353 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.425331 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8sqkc"
Apr 20 11:42:58.531119 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.531082 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lvlmv"]
Apr 20 11:42:58.533294 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:58.533267 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d403af_81c7_4e78_8fe5_d31a6f123b4b.slice/crio-efc155048ec97bbaba9a191842abff28ad19a775dbe261a612e5f52108d645d8 WatchSource:0}: Error finding container efc155048ec97bbaba9a191842abff28ad19a775dbe261a612e5f52108d645d8: Status 404 returned error can't find the container with id efc155048ec97bbaba9a191842abff28ad19a775dbe261a612e5f52108d645d8
Apr 20 11:42:58.557727 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.557700 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8sqkc"]
Apr 20 11:42:58.561170 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:58.561128 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3fe8f50_9343_4a7e_8938_2d2334926942.slice/crio-9b2b5a23f388aeff2514bad7f391ba25058d754840adf4d22bc9764e0f225a57 WatchSource:0}: Error finding container 9b2b5a23f388aeff2514bad7f391ba25058d754840adf4d22bc9764e0f225a57: Status 404 returned error can't find the container with id 9b2b5a23f388aeff2514bad7f391ba25058d754840adf4d22bc9764e0f225a57
Apr 20 11:42:58.595108 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.595079 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-69794ff49d-n7xjz"]
Apr 20 11:42:58.599137 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:58.599102 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod192a48bd_bcb6_4fec_9fa3_cd24f83284be.slice/crio-89f437f48822fff93824c7df9fe4c5ebb6801e0770decb6265c90628cbfa0511 WatchSource:0}: Error finding container 89f437f48822fff93824c7df9fe4c5ebb6801e0770decb6265c90628cbfa0511: Status 404 returned error can't find the container with id 89f437f48822fff93824c7df9fe4c5ebb6801e0770decb6265c90628cbfa0511
Apr 20 11:42:58.600702 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.600644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" event={"ID":"f7b624b2-8718-4b1f-9f76-0459cb6d4184","Type":"ContainerStarted","Data":"7b02a37d0098d29dbf87e6c47852e81f2d5ea8794657fc77f027618dff66a6b9"}
Apr 20 11:42:58.602154 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.602132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4mhc4"]
Apr 20 11:42:58.602884 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.602853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" event={"ID":"904766d7-451c-4378-8dbf-8486dbbd70e6","Type":"ContainerStarted","Data":"4ed46f3364575072811612cc77af516501f99ad1c9c46624771d0bfc639ec03d"}
Apr 20 11:42:58.604047 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.604011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvlmv" event={"ID":"53d403af-81c7-4e78-8fe5-d31a6f123b4b","Type":"ContainerStarted","Data":"efc155048ec97bbaba9a191842abff28ad19a775dbe261a612e5f52108d645d8"}
Apr 20 11:42:58.604970 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:42:58.604937 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a58fa4b_ae1b_451e_ab5e_4aaf7713ebdd.slice/crio-53414775eec685cd1abe272193d4eca359679543d1766023d870c5ef42b47cee WatchSource:0}: Error finding container 53414775eec685cd1abe272193d4eca359679543d1766023d870c5ef42b47cee: Status 404 returned error can't find the container with id 53414775eec685cd1abe272193d4eca359679543d1766023d870c5ef42b47cee
Apr 20 11:42:58.605168 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.605137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8sqkc" event={"ID":"e3fe8f50-9343-4a7e-8938-2d2334926942","Type":"ContainerStarted","Data":"9b2b5a23f388aeff2514bad7f391ba25058d754840adf4d22bc9764e0f225a57"}
Apr 20 11:42:58.605264 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.605199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b66586b66-hghv5"
Apr 20 11:42:58.612804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.612782 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b66586b66-hghv5"
Apr 20 11:42:58.736308 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736269 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-trusted-ca\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736321 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8796d702-f320-49e1-9033-817a78763256-ca-trust-extracted\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736352 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736385 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-installation-pull-secrets\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736413 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-image-registry-private-configuration\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736460 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p2tm\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-kube-api-access-2p2tm\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736487 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-registry-certificates\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736776 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736528 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-bound-sa-token\") pod \"8796d702-f320-49e1-9033-817a78763256\" (UID: \"8796d702-f320-49e1-9033-817a78763256\") "
Apr 20 11:42:58.736776 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736576 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8796d702-f320-49e1-9033-817a78763256-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 11:42:58.736776 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.736762 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8796d702-f320-49e1-9033-817a78763256-ca-trust-extracted\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.737102 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.737051 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 11:42:58.737102 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.737139 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 11:42:58.739408 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.739368 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 11:42:58.739579 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.739460 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:42:58.739953 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.739918 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 11:42:58.740041 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.739925 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:42:58.740179 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.740153 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-kube-api-access-2p2tm" (OuterVolumeSpecName: "kube-api-access-2p2tm") pod "8796d702-f320-49e1-9033-817a78763256" (UID: "8796d702-f320-49e1-9033-817a78763256"). InnerVolumeSpecName "kube-api-access-2p2tm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 11:42:58.838117 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838087 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-registry-tls\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.838117 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838117 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-installation-pull-secrets\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.838588 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838131 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8796d702-f320-49e1-9033-817a78763256-image-registry-private-configuration\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.838588 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838144 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2p2tm\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-kube-api-access-2p2tm\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.838588 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838159 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-registry-certificates\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.838588 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838173 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8796d702-f320-49e1-9033-817a78763256-bound-sa-token\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:58.838588 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:58.838186 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8796d702-f320-49e1-9033-817a78763256-trusted-ca\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:42:59.610851 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.610756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" event={"ID":"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd","Type":"ContainerStarted","Data":"53414775eec685cd1abe272193d4eca359679543d1766023d870c5ef42b47cee"}
Apr 20 11:42:59.613021 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.612993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b66586b66-hghv5"
Apr 20 11:42:59.614066 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.614033 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69794ff49d-n7xjz" event={"ID":"192a48bd-bcb6-4fec-9fa3-cd24f83284be","Type":"ContainerStarted","Data":"fd560dcfe9f9b1189c201baca256a2a85c4e1f599cfaf5c324e3bda00e3d9534"}
Apr 20 11:42:59.614213 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.614114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-69794ff49d-n7xjz" event={"ID":"192a48bd-bcb6-4fec-9fa3-cd24f83284be","Type":"ContainerStarted","Data":"89f437f48822fff93824c7df9fe4c5ebb6801e0770decb6265c90628cbfa0511"}
Apr 20 11:42:59.638537 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.637360 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-69794ff49d-n7xjz" podStartSLOduration=2.63734137 podStartE2EDuration="2.63734137s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:42:59.636856579 +0000 UTC m=+65.760275827" watchObservedRunningTime="2026-04-20 11:42:59.63734137 +0000 UTC m=+65.760760621"
Apr 20 11:42:59.670441 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.670380 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b66586b66-hghv5"]
Apr 20 11:42:59.673473 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:42:59.673444 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b66586b66-hghv5"]
Apr 20 11:43:00.150134 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.150078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh"
Apr 20 11:43:00.153146 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.153117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d9165296-57f0-4590-ad83-189871356a1a-metrics-certs\") pod \"network-metrics-daemon-4lcnh\" (UID: \"d9165296-57f0-4590-ad83-189871356a1a\") " pod="openshift-multus/network-metrics-daemon-4lcnh"
Apr 20 11:43:00.235109 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.235070 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:43:00.237842 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.237818 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:43:00.335118 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.335088 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8796d702-f320-49e1-9033-817a78763256" path="/var/lib/kubelet/pods/8796d702-f320-49e1-9033-817a78763256/volumes"
Apr 20 11:43:00.351665 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.351636 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lppl4\""
Apr 20 11:43:00.359334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.359309 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4lcnh"
Apr 20 11:43:00.615728 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.615693 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:43:00.617181 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:00.617154 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-69794ff49d-n7xjz"
Apr 20 11:43:03.010817 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.010765 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4lcnh"]
Apr 20 11:43:03.016483 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:43:03.016450 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9165296_57f0_4590_ad83_189871356a1a.slice/crio-8ba5750da8dc7955fdfe82f5c57888ad58104b8c2149e21d36690de92d7d6711 WatchSource:0}: Error finding container 8ba5750da8dc7955fdfe82f5c57888ad58104b8c2149e21d36690de92d7d6711: Status 404 returned error can't find the container with id 8ba5750da8dc7955fdfe82f5c57888ad58104b8c2149e21d36690de92d7d6711
Apr 20 11:43:03.545753 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.545519 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qzgrd"
Apr 20 11:43:03.625871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.625819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4lcnh" event={"ID":"d9165296-57f0-4590-ad83-189871356a1a","Type":"ContainerStarted","Data":"8ba5750da8dc7955fdfe82f5c57888ad58104b8c2149e21d36690de92d7d6711"}
Apr 20 11:43:03.628613 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.628561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8sqkc" event={"ID":"e3fe8f50-9343-4a7e-8938-2d2334926942","Type":"ContainerStarted","Data":"a739f7caf9619fc14de2fbcadfbb78b2bd7c56f85f4a47483993b13a3efb5bc9"}
Apr 20 11:43:03.630064 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.630035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" event={"ID":"7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd","Type":"ContainerStarted","Data":"0a3439164849dcbfd0030eaff73af4868824ee03e0e5f57ff423059f66d84aeb"}
Apr 20 11:43:03.630807 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.630786 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4"
Apr 20 11:43:03.632420 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.632397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" event={"ID":"f7b624b2-8718-4b1f-9f76-0459cb6d4184","Type":"ContainerStarted","Data":"730ab1635b4b4989256aa9a6f92cf8fff7c0c612d6dba97803c9f5b9afb2c9e2"}
Apr 20 11:43:03.633937 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.633914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" event={"ID":"904766d7-451c-4378-8dbf-8486dbbd70e6","Type":"ContainerStarted","Data":"b44cddf8ab9ab7819dd6fe27354e7b13979e8a8abbb8c526c63128990e787d28"}
Apr 20 11:43:03.636166 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.636121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvlmv" event={"ID":"53d403af-81c7-4e78-8fe5-d31a6f123b4b","Type":"ContainerStarted","Data":"ace0bad43955826978b268f577fe5ae8f30849979e8f8d127877a18a12fa632f"}
Apr 20 11:43:03.636166 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.636146 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lvlmv" event={"ID":"53d403af-81c7-4e78-8fe5-d31a6f123b4b","Type":"ContainerStarted","Data":"d6637d171ee6cbc95f82fad59e5f5dcdc7de24f4bb816210eff0f7e0167b4449"}
Apr 20 11:43:03.636418 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.636394 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lvlmv"
Apr 20 11:43:03.653329 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.653264 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8sqkc" podStartSLOduration=33.335172755 podStartE2EDuration="37.653219102s" podCreationTimestamp="2026-04-20 11:42:26 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.563381251 +0000 UTC m=+64.686800476" lastFinishedPulling="2026-04-20 11:43:02.881427591 +0000 UTC m=+69.004846823" observedRunningTime="2026-04-20 11:43:03.652742284 +0000 UTC m=+69.776161531" watchObservedRunningTime="2026-04-20 11:43:03.653219102 +0000 UTC m=+69.776638347"
Apr 20 11:43:03.707363 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.707313 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4" podStartSLOduration=2.438235073 podStartE2EDuration="6.707296308s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.606804684 +0000 UTC m=+64.730223910" lastFinishedPulling="2026-04-20 11:43:02.875865913 +0000 UTC m=+68.999285145" observedRunningTime="2026-04-20 11:43:03.70680371 +0000 UTC m=+69.830222958" watchObservedRunningTime="2026-04-20 11:43:03.707296308 +0000 UTC m=+69.830715555"
Apr 20 11:43:03.707985 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.707656 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qrlg" podStartSLOduration=2.073454848 podStartE2EDuration="6.70764835s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.241670343 +0000 UTC m=+64.365089574" lastFinishedPulling="2026-04-20 11:43:02.875863851 +0000 UTC m=+68.999283076" observedRunningTime="2026-04-20 11:43:03.680386513 +0000 UTC m=+69.803805764" watchObservedRunningTime="2026-04-20 11:43:03.70764835 +0000 UTC m=+69.831067597"
Apr 20 11:43:03.731670 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.731606 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lvlmv" podStartSLOduration=33.390731672 podStartE2EDuration="37.731590432s" podCreationTimestamp="2026-04-20 11:42:26 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.535003552 +0000 UTC m=+64.658422780" lastFinishedPulling="2026-04-20 11:43:02.87586231 +0000 UTC m=+68.999281540" observedRunningTime="2026-04-20 11:43:03.730592544 +0000 UTC m=+69.854011792" watchObservedRunningTime="2026-04-20 11:43:03.731590432 +0000 UTC m=+69.855009680"
Apr 20 11:43:03.758147 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:03.758080 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-khfpp" podStartSLOduration=2.26567858 podStartE2EDuration="6.758059686s" podCreationTimestamp="2026-04-20 11:42:57 +0000 UTC" firstStartedPulling="2026-04-20 11:42:58.384780049 +0000 UTC m=+64.508199274" lastFinishedPulling="2026-04-20 11:43:02.877161142 +0000 UTC m=+69.000580380" observedRunningTime="2026-04-20 11:43:03.756586833 +0000 UTC m=+69.880006081" watchObservedRunningTime="2026-04-20 11:43:03.758059686 +0000 UTC m=+69.881478934"
Apr 20 11:43:04.110766 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.110734 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4mhc4"
Apr 20 11:43:04.283743 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.283708 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-m82f2"]
Apr 20 11:43:04.298596 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.298558 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m82f2"]
Apr 20 11:43:04.298748 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.298634 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:04.301410 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.301388 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 11:43:04.301548 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.301394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bk9fm\""
Apr 20 11:43:04.301548 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.301394 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 11:43:04.384862 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.384771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs24n\" (UniqueName: \"kubernetes.io/projected/ea955ee0-5a1d-4e72-bca6-f985586b22e1-kube-api-access-vs24n\") pod \"downloads-6bcc868b7-m82f2\" (UID: \"ea955ee0-5a1d-4e72-bca6-f985586b22e1\") " pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:04.485557 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.485517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs24n\" (UniqueName: \"kubernetes.io/projected/ea955ee0-5a1d-4e72-bca6-f985586b22e1-kube-api-access-vs24n\") pod \"downloads-6bcc868b7-m82f2\" (UID: \"ea955ee0-5a1d-4e72-bca6-f985586b22e1\") " pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:04.494525 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.494495 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs24n\" (UniqueName: \"kubernetes.io/projected/ea955ee0-5a1d-4e72-bca6-f985586b22e1-kube-api-access-vs24n\") pod \"downloads-6bcc868b7-m82f2\" (UID: \"ea955ee0-5a1d-4e72-bca6-f985586b22e1\") " pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:04.609397 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.609357 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:04.755920 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:04.755889 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-m82f2"]
Apr 20 11:43:04.759188 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:43:04.759160 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea955ee0_5a1d_4e72_bca6_f985586b22e1.slice/crio-4a65050662396de064851d591ca1b46cd9dbc801f3a2124e59926bb5e2c6a5a2 WatchSource:0}: Error finding container 4a65050662396de064851d591ca1b46cd9dbc801f3a2124e59926bb5e2c6a5a2: Status 404 returned error can't find the container with id 4a65050662396de064851d591ca1b46cd9dbc801f3a2124e59926bb5e2c6a5a2
Apr 20 11:43:05.644339 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:05.644287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m82f2" event={"ID":"ea955ee0-5a1d-4e72-bca6-f985586b22e1","Type":"ContainerStarted","Data":"4a65050662396de064851d591ca1b46cd9dbc801f3a2124e59926bb5e2c6a5a2"}
Apr 20 11:43:05.646291 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:05.646259 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4lcnh" event={"ID":"d9165296-57f0-4590-ad83-189871356a1a","Type":"ContainerStarted","Data":"26b8fff947989d6443bddbd26994b6ac07f330e36092ff41c31458cfffe2d42c"}
Apr 20 11:43:05.646414 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:05.646300 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4lcnh" event={"ID":"d9165296-57f0-4590-ad83-189871356a1a","Type":"ContainerStarted","Data":"a6250c8c7d35b45847bf2881ad25e75b8b82107c8bf1dddd9504618aca40d0a3"}
Apr 20 11:43:05.664640 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:05.664576 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-multus/network-metrics-daemon-4lcnh" podStartSLOduration=70.124725624 podStartE2EDuration="1m11.664559372s" podCreationTimestamp="2026-04-20 11:41:54 +0000 UTC" firstStartedPulling="2026-04-20 11:43:03.018965178 +0000 UTC m=+69.142384403" lastFinishedPulling="2026-04-20 11:43:04.558798915 +0000 UTC m=+70.682218151" observedRunningTime="2026-04-20 11:43:05.664281554 +0000 UTC m=+71.787700802" watchObservedRunningTime="2026-04-20 11:43:05.664559372 +0000 UTC m=+71.787978619" Apr 20 11:43:08.264454 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:08.264404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lvlmv_53d403af-81c7-4e78-8fe5-d31a6f123b4b/dns/0.log" Apr 20 11:43:08.445736 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:08.445707 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lvlmv_53d403af-81c7-4e78-8fe5-d31a6f123b4b/kube-rbac-proxy/0.log" Apr 20 11:43:08.845044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:08.845014 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fq5mw_9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552/dns-node-resolver/0.log" Apr 20 11:43:09.124067 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.123973 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9rcnn"] Apr 20 11:43:09.127085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.127055 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.131457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.131309 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 11:43:09.131457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.131325 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 11:43:09.131457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.131334 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-lb4xc\"" Apr 20 11:43:09.131457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.131350 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 11:43:09.131457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.131326 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 11:43:09.131457 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.131309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 11:43:09.138101 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.137177 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9rcnn"] Apr 20 11:43:09.226722 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.226690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.226896 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.226737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.226896 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.226862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.227024 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.226912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wndw\" (UniqueName: \"kubernetes.io/projected/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-kube-api-access-8wndw\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.327413 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.327378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.327413 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:43:09.327415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wndw\" (UniqueName: \"kubernetes.io/projected/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-kube-api-access-8wndw\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.327871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.327450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.327871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.327511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.327871 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:09.327693 2578 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 11:43:09.327871 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:09.327765 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-tls podName:fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d nodeName:}" failed. No retries permitted until 2026-04-20 11:43:09.827743983 +0000 UTC m=+75.951163214 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-9rcnn" (UID: "fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d") : secret "prometheus-operator-tls" not found Apr 20 11:43:09.328499 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.328474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.330336 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.330311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.338190 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.338162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wndw\" (UniqueName: \"kubernetes.io/projected/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-kube-api-access-8wndw\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.831411 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.831373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: 
\"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.834320 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.834293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9rcnn\" (UID: \"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:09.844671 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:09.844643 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ltzgg_8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d/node-ca/0.log" Apr 20 11:43:10.039282 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:10.039231 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" Apr 20 11:43:10.177035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:10.176968 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9rcnn"] Apr 20 11:43:10.180484 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:43:10.180449 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb6784c3_659d_48bd_a0c2_ff3dcb04ee3d.slice/crio-4b18bb9d08214aa62149903e91800ca5578d11de5e7f368c7d388906477af116 WatchSource:0}: Error finding container 4b18bb9d08214aa62149903e91800ca5578d11de5e7f368c7d388906477af116: Status 404 returned error can't find the container with id 4b18bb9d08214aa62149903e91800ca5578d11de5e7f368c7d388906477af116 Apr 20 11:43:10.245929 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:10.245898 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-69794ff49d-n7xjz_192a48bd-bcb6-4fec-9fa3-cd24f83284be/router/0.log" Apr 20 11:43:10.445758 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:10.445673 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8sqkc_e3fe8f50-9343-4a7e-8938-2d2334926942/serve-healthcheck-canary/0.log" Apr 20 11:43:10.667887 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:10.667834 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" event={"ID":"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d","Type":"ContainerStarted","Data":"4b18bb9d08214aa62149903e91800ca5578d11de5e7f368c7d388906477af116"} Apr 20 11:43:11.449876 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:11.449756 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-khfpp_f7b624b2-8718-4b1f-9f76-0459cb6d4184/kube-storage-version-migrator-operator/0.log" Apr 20 11:43:12.675913 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:12.675869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" event={"ID":"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d","Type":"ContainerStarted","Data":"b2085f5c4b714ad8c8b88e28449bd4bc0d73ceb2c0216f9fab025a95796bc3fc"} Apr 20 11:43:12.675913 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:12.675917 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" event={"ID":"fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d","Type":"ContainerStarted","Data":"9ca29d8809559645a2da481dec9e2cfa44e0ad7da9ea95537d3ffafba33b5ad8"} Apr 20 11:43:12.698492 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:12.698440 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9rcnn" 
podStartSLOduration=2.229125198 podStartE2EDuration="3.698426057s" podCreationTimestamp="2026-04-20 11:43:09 +0000 UTC" firstStartedPulling="2026-04-20 11:43:10.182542853 +0000 UTC m=+76.305962078" lastFinishedPulling="2026-04-20 11:43:11.651843696 +0000 UTC m=+77.775262937" observedRunningTime="2026-04-20 11:43:12.696806445 +0000 UTC m=+78.820225693" watchObservedRunningTime="2026-04-20 11:43:12.698426057 +0000 UTC m=+78.821845303" Apr 20 11:43:13.075621 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.075583 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zps6q"] Apr 20 11:43:13.109989 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.109954 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zps6q"] Apr 20 11:43:13.110164 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.110136 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.113939 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.113908 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rffgp\"" Apr 20 11:43:13.114081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.113944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 11:43:13.114081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.113948 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 11:43:13.114081 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.114015 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 11:43:13.114272 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.114130 
2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 11:43:13.160658 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.160614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/46495978-0693-4293-ac30-560a6e13e86c-data-volume\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.160818 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.160668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/46495978-0693-4293-ac30-560a6e13e86c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.160818 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.160710 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtrb2\" (UniqueName: \"kubernetes.io/projected/46495978-0693-4293-ac30-560a6e13e86c-kube-api-access-xtrb2\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.160818 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.160793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/46495978-0693-4293-ac30-560a6e13e86c-crio-socket\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.160958 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:43:13.160818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/46495978-0693-4293-ac30-560a6e13e86c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.261700 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.261665 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/46495978-0693-4293-ac30-560a6e13e86c-crio-socket\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.261700 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.261703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/46495978-0693-4293-ac30-560a6e13e86c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.261935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.261734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/46495978-0693-4293-ac30-560a6e13e86c-data-volume\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.261935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.261759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/46495978-0693-4293-ac30-560a6e13e86c-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.261935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.261801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtrb2\" (UniqueName: \"kubernetes.io/projected/46495978-0693-4293-ac30-560a6e13e86c-kube-api-access-xtrb2\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.261935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.261894 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/46495978-0693-4293-ac30-560a6e13e86c-crio-socket\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.262210 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.262189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/46495978-0693-4293-ac30-560a6e13e86c-data-volume\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.262591 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.262568 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/46495978-0693-4293-ac30-560a6e13e86c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.265130 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.265101 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/46495978-0693-4293-ac30-560a6e13e86c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.273413 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.273384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtrb2\" (UniqueName: \"kubernetes.io/projected/46495978-0693-4293-ac30-560a6e13e86c-kube-api-access-xtrb2\") pod \"insights-runtime-extractor-zps6q\" (UID: \"46495978-0693-4293-ac30-560a6e13e86c\") " pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.422160 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.422066 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zps6q" Apr 20 11:43:13.574456 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.574426 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zps6q"] Apr 20 11:43:13.642942 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.642889 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lvlmv" Apr 20 11:43:13.680937 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:13.680354 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zps6q" event={"ID":"46495978-0693-4293-ac30-560a6e13e86c","Type":"ContainerStarted","Data":"aebbb1bb85ce2a934b422f664cc2a3ec60470a9798e027ad7753d7b2277b656a"} Apr 20 11:43:14.537677 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.537637 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vfzd7"] Apr 20 11:43:14.542494 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.542467 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.546509 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.546480 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 11:43:14.546933 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.546910 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 11:43:14.547135 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.547019 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 11:43:14.547198 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.547171 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5j4wd\"" Apr 20 11:43:14.673591 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673550 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-tls\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.673772 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-wtmp\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.673772 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-metrics-client-ca\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.673772 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58k52\" (UniqueName: \"kubernetes.io/projected/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-kube-api-access-58k52\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.673772 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673753 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-accelerators-collector-config\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.673957 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7" Apr 20 11:43:14.673957 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-textfile\") pod \"node-exporter-vfzd7\" (UID: 
\"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.673957 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673856 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-root\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.673957 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.673885 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-sys\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.684700 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.684664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zps6q" event={"ID":"46495978-0693-4293-ac30-560a6e13e86c","Type":"ContainerStarted","Data":"3c6ff0a35e584ef8639bec6b88b37a260c82b08022d880390986380e6e20244c"}
Apr 20 11:43:14.775146 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-tls\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-wtmp\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775375 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-metrics-client-ca\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775375 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:14.775349 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 11:43:14.775547 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:14.775428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-tls podName:9a4a187b-9bc0-45c2-b2f3-2f91c2357f40 nodeName:}" failed. No retries permitted until 2026-04-20 11:43:15.275402611 +0000 UTC m=+81.398821853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-tls") pod "node-exporter-vfzd7" (UID: "9a4a187b-9bc0-45c2-b2f3-2f91c2357f40") : secret "node-exporter-tls" not found
Apr 20 11:43:14.775547 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775319 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-wtmp\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775547 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775353 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58k52\" (UniqueName: \"kubernetes.io/projected/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-kube-api-access-58k52\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775547 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-accelerators-collector-config\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775749 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775557 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775749 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-textfile\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775749 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-root\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775749 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-sys\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775749 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775740 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-sys\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775980 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-root\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.775980 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-metrics-client-ca\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.776104 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.775995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-accelerators-collector-config\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.776104 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.776050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-textfile\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.778197 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.778177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:14.785951 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:14.785910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58k52\" (UniqueName: \"kubernetes.io/projected/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-kube-api-access-58k52\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:15.280560 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.280524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-tls\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:15.283545 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.283515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a4a187b-9bc0-45c2-b2f3-2f91c2357f40-node-exporter-tls\") pod \"node-exporter-vfzd7\" (UID: \"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40\") " pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:15.455543 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.455502 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vfzd7"
Apr 20 11:43:15.642015 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.641929 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c5cb569d4-mn8ck"]
Apr 20 11:43:15.646539 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.646511 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.653464 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.653436 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-krrnw\""
Apr 20 11:43:15.653655 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.653464 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 11:43:15.655426 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.655404 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 11:43:15.655613 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.655593 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 11:43:15.655715 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.655642 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 11:43:15.655898 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.655882 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 11:43:15.661682 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.661654 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 11:43:15.663271 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.663218 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c5cb569d4-mn8ck"]
Apr 20 11:43:15.756466 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.756424 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:43:15.760580 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.760552 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.763871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.763695 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 11:43:15.763871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.763702 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 11:43:15.764365 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.763964 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 11:43:15.764365 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 11:43:15.764365 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764186 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kt6q5\""
Apr 20 11:43:15.764365 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764319 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 11:43:15.764699 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764681 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 11:43:15.764815 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764726 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 11:43:15.764948 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764934 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 11:43:15.765040 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.764944 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 11:43:15.778498 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.778447 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:43:15.785743 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-trusted-ca-bundle\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.785912 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-oauth-serving-cert\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.785912 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-serving-cert\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.785912 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-config\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.785912 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785891 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-service-ca\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.786108 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-oauth-config\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.786108 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.785968 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fs4\" (UniqueName: \"kubernetes.io/projected/a7c6c9c5-9b0b-4d52-90db-f8193effb884-kube-api-access-p6fs4\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.886971 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.886934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-config-volume\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.886971 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.886985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fs4\" (UniqueName: \"kubernetes.io/projected/a7c6c9c5-9b0b-4d52-90db-f8193effb884-kube-api-access-p6fs4\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887049 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887075 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-trusted-ca-bundle\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82ld4\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-kube-api-access-82ld4\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-oauth-serving-cert\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887207 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887337 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-web-config\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887372 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-serving-cert\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887403 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-config\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-service-ca\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-config-out\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-oauth-config\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.887535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.887503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.889288 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.889214 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-config\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.889449 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.889381 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-service-ca\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.889449 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.889410 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-oauth-serving-cert\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.889449 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.889420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-trusted-ca-bundle\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.891767 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.891744 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-serving-cert\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.891884 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.891789 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-oauth-config\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.897387 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.897315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fs4\" (UniqueName: \"kubernetes.io/projected/a7c6c9c5-9b0b-4d52-90db-f8193effb884-kube-api-access-p6fs4\") pod \"console-7c5cb569d4-mn8ck\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") " pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.958318 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.958284 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:15.988432 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988398 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82ld4\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-kube-api-access-82ld4\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988501 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-web-config\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988608 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-config-out\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:15.988614 2578 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988661 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-config-volume\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:15.988693 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls podName:49b3596d-08f0-4661-9c8c-dc64d845513f nodeName:}" failed. No retries permitted until 2026-04-20 11:43:16.48867267 +0000 UTC m=+82.612091910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f") : secret "alertmanager-main-tls" not found
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.988840 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.989066 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.988840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.990465 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.989841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.990465 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.990359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.990966 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.990788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.991833 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.991805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-config-out\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.992027 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.992003 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-config-volume\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.993235 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.993205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-web-config\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.993357 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.993281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.993616 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.993580 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:43:15.993692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.993678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName:
\"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:15.994028 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.994004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:15.994105 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.994031 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:15.997614 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:15.997589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82ld4\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-kube-api-access-82ld4\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:16.493426 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.493387 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:16.496476 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.496452 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:16.663308 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.663269 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6685d9887c-vsnfc"] Apr 20 11:43:16.668630 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.668605 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.671631 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.671606 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 11:43:16.671775 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.671705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-2pcdh\"" Apr 20 11:43:16.671965 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.671945 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 11:43:16.672051 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.671990 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:43:16.672110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.672079 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 11:43:16.672258 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.672215 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 11:43:16.672383 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.672285 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 11:43:16.672465 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.672447 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fuqjrci346jh\"" Apr 20 11:43:16.677318 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.677293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6685d9887c-vsnfc"] Apr 20 11:43:16.795383 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795392 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345aa0c5-2e04-44b5-800d-435f23ed96de-metrics-client-ca\") pod \"thanos-querier-6685d9887c-vsnfc\" 
(UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-grpc-tls\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkvm\" (UniqueName: \"kubernetes.io/projected/345aa0c5-2e04-44b5-800d-435f23ed96de-kube-api-access-cbkvm\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 
20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-tls\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.795841 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.795770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896382 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896577 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896400 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345aa0c5-2e04-44b5-800d-435f23ed96de-metrics-client-ca\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896577 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896435 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-grpc-tls\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896577 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896577 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896577 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkvm\" (UniqueName: \"kubernetes.io/projected/345aa0c5-2e04-44b5-800d-435f23ed96de-kube-api-access-cbkvm\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896844 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896578 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-tls\") pod 
\"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.896844 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.896619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.897203 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.897168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345aa0c5-2e04-44b5-800d-435f23ed96de-metrics-client-ca\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.900424 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.900354 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-grpc-tls\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.900424 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.900395 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.900701 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:43:16.900539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.900797 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.900766 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.901199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.901174 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.901339 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.901287 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/345aa0c5-2e04-44b5-800d-435f23ed96de-secret-thanos-querier-tls\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.907351 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.907301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkvm\" (UniqueName: 
\"kubernetes.io/projected/345aa0c5-2e04-44b5-800d-435f23ed96de-kube-api-access-cbkvm\") pod \"thanos-querier-6685d9887c-vsnfc\" (UID: \"345aa0c5-2e04-44b5-800d-435f23ed96de\") " pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:16.981659 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:16.981615 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" Apr 20 11:43:20.899921 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.898686 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:43:20.919780 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.919748 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:43:20.921340 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.920125 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:20.926229 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.925543 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2hwpv\"" Apr 20 11:43:20.930119 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.929917 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.930359 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.930971 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.931888 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932033 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932162 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932530 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932591 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932729 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932816 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.932893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 11:43:20.934076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.933075 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-548tscpr96qlm\"" Apr 20 11:43:20.934806 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.934783 2578 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 11:43:20.936430 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.936403 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6685d9887c-vsnfc"] Apr 20 11:43:20.939418 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:20.939196 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 11:43:21.041940 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.041722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.041940 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.041785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.041940 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.041833 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.041940 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.041873 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.041940 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.041902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-web-config\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042015 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042075 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config-out\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042100 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxqt\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-kube-api-access-9sxqt\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042199 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042296 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042370 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042842 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042842 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.042438 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:43:21.042842 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:43:21.042492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143504 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143637 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143637 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143547 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143637 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143637 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143717 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-web-config\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.143871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config-out\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.144147 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.144147 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxqt\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-kube-api-access-9sxqt\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.144147 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.143929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.145157 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.144336 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.146635 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.145709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.146635 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.146149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.146635 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.146393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.146854 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.146814 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.146917 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.146866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.146917 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.146902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.147015 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.146939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.147062 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.147022 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.157724 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.151162 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.157724 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.151745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.157724 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.151801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.157724 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.151905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.157724 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.155563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.158208 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.158184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.158967 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.158939 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config-out\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.159569 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.159537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-web-config\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.159671 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.159572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.159882 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.159858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.161202 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.161016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.163029 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.162993 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxqt\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-kube-api-access-9sxqt\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.163192 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.163016 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.166548 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.166523 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c5cb569d4-mn8ck"]
Apr 20 11:43:21.173283 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:43:21.173234 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c6c9c5_9b0b_4d52_90db_f8193effb884.slice/crio-d0b602c8e48c708d85049bb34d555ed14ffe86fcb4bdae243fe08f72c073c694 WatchSource:0}: Error finding container d0b602c8e48c708d85049bb34d555ed14ffe86fcb4bdae243fe08f72c073c694: Status 404 returned error can't find the container with id d0b602c8e48c708d85049bb34d555ed14ffe86fcb4bdae243fe08f72c073c694
Apr 20 11:43:21.181998 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.181937 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:43:21.192587 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:43:21.192561 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49b3596d_08f0_4661_9c8c_dc64d845513f.slice/crio-a1cc807cd5f22c4f90e185de6cf9fae4d36496f277894adc47024a397a4e9a4a WatchSource:0}: Error finding container a1cc807cd5f22c4f90e185de6cf9fae4d36496f277894adc47024a397a4e9a4a: Status 404 returned error can't find the container with id a1cc807cd5f22c4f90e185de6cf9fae4d36496f277894adc47024a397a4e9a4a
Apr 20 11:43:21.241463 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.241382 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:21.417342 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.417303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 11:43:21.421619 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:43:21.421579 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod232fa66a_5df6_4947_8ac8_51c68ea12a9c.slice/crio-e6f04c6534887e9403bca37cc0af0f286dec76b22277802c651b235d2d1f96ba WatchSource:0}: Error finding container e6f04c6534887e9403bca37cc0af0f286dec76b22277802c651b235d2d1f96ba: Status 404 returned error can't find the container with id e6f04c6534887e9403bca37cc0af0f286dec76b22277802c651b235d2d1f96ba
Apr 20 11:43:21.714181 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.714138 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-m82f2" event={"ID":"ea955ee0-5a1d-4e72-bca6-f985586b22e1","Type":"ContainerStarted","Data":"cc9270b69aafde3ad7788288c552c7baa3efc886c57d54f73bfb6cf4e9269043"}
Apr 20 11:43:21.714723 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.714649 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:21.716117 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.715964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"a1cc807cd5f22c4f90e185de6cf9fae4d36496f277894adc47024a397a4e9a4a"}
Apr 20 11:43:21.718066 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.718036 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfzd7" event={"ID":"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40","Type":"ContainerStarted","Data":"564cd04650824807fe138e010f1e7a16e2e33e82900d640aac15d88f7ebe61cd"}
Apr 20 11:43:21.719835 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.719804 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"45b289bffa252f973a1b100ccd2d62a5dd8ab321a90ee176cd5ae36302e55ad6"}
Apr 20 11:43:21.723605 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.723577 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zps6q" event={"ID":"46495978-0693-4293-ac30-560a6e13e86c","Type":"ContainerStarted","Data":"997eec12e50f5389850ef1c8b9a361e513f706a9dcc0370b236b66f903f5c188"}
Apr 20 11:43:21.725880 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.725836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"e6f04c6534887e9403bca37cc0af0f286dec76b22277802c651b235d2d1f96ba"}
Apr 20 11:43:21.727549 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.727526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c5cb569d4-mn8ck" event={"ID":"a7c6c9c5-9b0b-4d52-90db-f8193effb884","Type":"ContainerStarted","Data":"d0b602c8e48c708d85049bb34d555ed14ffe86fcb4bdae243fe08f72c073c694"}
Apr 20 11:43:21.729232 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.729213 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-m82f2"
Apr 20 11:43:21.734401 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:21.734352 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-m82f2" podStartSLOduration=1.630827442 podStartE2EDuration="17.734334984s" podCreationTimestamp="2026-04-20 11:43:04 +0000 UTC" firstStartedPulling="2026-04-20 11:43:04.761065354 +0000 UTC m=+70.884484598" lastFinishedPulling="2026-04-20 11:43:20.864572904 +0000 UTC m=+86.987992140" observedRunningTime="2026-04-20 11:43:21.733647906 +0000 UTC m=+87.857067154" watchObservedRunningTime="2026-04-20 11:43:21.734334984 +0000 UTC m=+87.857754231"
Apr 20 11:43:22.735535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:22.735499 2578 generic.go:358] "Generic (PLEG): container finished" podID="9a4a187b-9bc0-45c2-b2f3-2f91c2357f40" containerID="471c068c34e8946050d0a060a20b3718fab30ca6e901866df2c15065731dc626" exitCode=0
Apr 20 11:43:22.736743 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:22.736332 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfzd7" event={"ID":"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40","Type":"ContainerDied","Data":"471c068c34e8946050d0a060a20b3718fab30ca6e901866df2c15065731dc626"}
Apr 20 11:43:26.499116 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.499076 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c5cb569d4-mn8ck"]
Apr 20 11:43:26.751832 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.751712 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad"}
Apr 20 11:43:26.754314 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.754277 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfzd7" event={"ID":"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40","Type":"ContainerStarted","Data":"cbbf95456be8df53562426e9e17fd6c82b7d91ffb45cb32525a42f97355f66fb"}
Apr 20 11:43:26.756035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.755986 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"401dedb83e29442ad4fb9bad9b3f8fbec9dc771541e68cf3d1c7eb91732c2902"}
Apr 20 11:43:26.768655 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.768610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zps6q" event={"ID":"46495978-0693-4293-ac30-560a6e13e86c","Type":"ContainerStarted","Data":"00d6708216f499ef1e80ab2727a81cea78b62053ec734dc53740d6908ca7933a"}
Apr 20 11:43:26.775215 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.773946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"}
Apr 20 11:43:26.792311 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:26.792230 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zps6q" podStartSLOduration=0.947669656 podStartE2EDuration="13.792211018s" podCreationTimestamp="2026-04-20 11:43:13 +0000 UTC" firstStartedPulling="2026-04-20 11:43:13.726919258 +0000 UTC m=+79.850338487" lastFinishedPulling="2026-04-20 11:43:26.571460611 +0000 UTC m=+92.694879849" observedRunningTime="2026-04-20 11:43:26.790848267 +0000 UTC m=+92.914267515" watchObservedRunningTime="2026-04-20 11:43:26.792211018 +0000 UTC m=+92.915630264"
Apr 20 11:43:27.780953 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.780895 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c5cb569d4-mn8ck" event={"ID":"a7c6c9c5-9b0b-4d52-90db-f8193effb884","Type":"ContainerStarted","Data":"42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b"}
Apr 20 11:43:27.783521 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.783485 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad" exitCode=0
Apr 20 11:43:27.783675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.783576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad"}
Apr 20 11:43:27.789048 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.788589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vfzd7" event={"ID":"9a4a187b-9bc0-45c2-b2f3-2f91c2357f40","Type":"ContainerStarted","Data":"315a0281d42083655a5bf873d177b652a7966e6411e8c44ca3f677b2e4bcbf15"}
Apr 20 11:43:27.791829 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.791800 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"bbd96a5445c2e1c834deebf8acca279d5823e342e5fc585c2f3c624cb0540e80"}
Apr 20 11:43:27.791938 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.791840 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"048231d669087a4a6dc8783a7ed39e9d53127a003572971fcb4a179da4edc74e"}
Apr 20 11:43:27.795067 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.794965 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37" exitCode=0
Apr 20 11:43:27.795067 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.795020 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"}
Apr 20 11:43:27.801376 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.801295 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c5cb569d4-mn8ck" podStartSLOduration=7.096483585 podStartE2EDuration="12.80127768s" podCreationTimestamp="2026-04-20 11:43:15 +0000 UTC" firstStartedPulling="2026-04-20 11:43:21.175948947 +0000 UTC m=+87.299368172" lastFinishedPulling="2026-04-20 11:43:26.880743024 +0000 UTC m=+93.004162267" observedRunningTime="2026-04-20 11:43:27.799134073 +0000 UTC m=+93.922553314" watchObservedRunningTime="2026-04-20 11:43:27.80127768 +0000 UTC m=+93.924696928"
Apr 20 11:43:27.825012 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:27.824940 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vfzd7" podStartSLOduration=12.915068998 podStartE2EDuration="13.824920858s" podCreationTimestamp="2026-04-20 11:43:14 +0000 UTC" firstStartedPulling="2026-04-20 11:43:20.786796221 +0000 UTC m=+86.910215452" lastFinishedPulling="2026-04-20 11:43:21.696648086 +0000 UTC m=+87.820067312" observedRunningTime="2026-04-20 11:43:27.823066197 +0000 UTC m=+93.946485445" watchObservedRunningTime="2026-04-20 11:43:27.824920858 +0000 UTC m=+93.948340106"
Apr 20 11:43:28.803391 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:28.803323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"d914a9f0bcab1c8354d8f625fa09d2ed4680b72bea86e4c25037fa9d3d62fb97"}
Apr 20 11:43:29.812549 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:29.811340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"9c98f2edf29d1cc9822979323c714d0f0f84e0d4153823a39e35c42617581c9d"}
Apr 20 11:43:29.812549 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:29.811412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" event={"ID":"345aa0c5-2e04-44b5-800d-435f23ed96de","Type":"ContainerStarted","Data":"bd9b530a88b8c989c1a888d47038c46c79507a4fba33da71188cce33c732ba34"}
Apr 20 11:43:29.812549 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:29.812507 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc"
Apr 20 11:43:29.844374 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:29.842504 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc" podStartSLOduration=6.162863397 podStartE2EDuration="13.842482487s" podCreationTimestamp="2026-04-20 11:43:16 +0000 UTC" firstStartedPulling="2026-04-20 11:43:20.941302395 +0000 UTC m=+87.064721620" lastFinishedPulling="2026-04-20 11:43:28.620921481 +0000 UTC m=+94.744340710" observedRunningTime="2026-04-20 11:43:29.840320977 +0000 UTC m=+95.963740227" watchObservedRunningTime="2026-04-20 11:43:29.842482487 +0000 UTC m=+95.965901736"
Apr 20 11:43:31.819636 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:31.819600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925"}
Apr 20 11:43:31.827647 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:31.827610 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6685d9887c-vsnfc"
Apr 20 11:43:32.827547 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.827511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c"}
Apr 20 11:43:32.828003 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.827560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af"}
Apr 20 11:43:32.828003 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.827575 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0"}
Apr 20 11:43:32.828003 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.827594 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a"}
Apr 20 11:43:32.828003 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.827607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerStarted","Data":"9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f"}
Apr 20 11:43:32.831674 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.831649 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"}
Apr 20 11:43:32.831822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.831686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"}
Apr 20 11:43:32.831822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.831700 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"}
Apr 20 11:43:32.831822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.831712 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"}
Apr 20 11:43:32.831822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.831724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"}
Apr 20 11:43:32.853878 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:32.853783 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=7.447039382 podStartE2EDuration="17.853765206s" podCreationTimestamp="2026-04-20 11:43:15 +0000 UTC" firstStartedPulling="2026-04-20 11:43:21.196584248 +0000 UTC m=+87.320003476" lastFinishedPulling="2026-04-20 11:43:31.603310056 +0000 UTC m=+97.726729300" observedRunningTime="2026-04-20 11:43:32.853225052 +0000 UTC m=+98.976644312" watchObservedRunningTime="2026-04-20 11:43:32.853765206 +0000 UTC m=+98.977184455"
Apr 20 11:43:33.837211 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:33.837168 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerStarted","Data":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"}
Apr 20 11:43:33.866879 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:33.866807 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.234824267 podStartE2EDuration="13.86678507s" podCreationTimestamp="2026-04-20 11:43:20 +0000 UTC" firstStartedPulling="2026-04-20 11:43:21.423923175 +0000 UTC m=+87.547342401" lastFinishedPulling="2026-04-20 11:43:32.055883977 +0000 UTC m=+98.179303204" observedRunningTime="2026-04-20 11:43:33.864026149 +0000 UTC m=+99.987445396" watchObservedRunningTime="2026-04-20 11:43:33.86678507 +0000 UTC m=+99.990204316"
Apr 20 11:43:35.959387 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:35.959341 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:36.242703 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:36.242617 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:43:52.809134 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:52.809069 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c5cb569d4-mn8ck" podUID="a7c6c9c5-9b0b-4d52-90db-f8193effb884" containerName="console" containerID="cri-o://42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b" gracePeriod=15
Apr 20 11:43:53.053758 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.053728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c5cb569d4-mn8ck_a7c6c9c5-9b0b-4d52-90db-f8193effb884/console/0.log"
Apr 20 11:43:53.053887 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.053793 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c5cb569d4-mn8ck"
Apr 20 11:43:53.165435 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165322 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-serving-cert\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.165435 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165379 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-trusted-ca-bundle\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.165670 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165488 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-oauth-config\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.165670 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165530 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6fs4\" (UniqueName: \"kubernetes.io/projected/a7c6c9c5-9b0b-4d52-90db-f8193effb884-kube-api-access-p6fs4\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.165670 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165557 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-service-ca\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.165670 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165615 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-config\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.165670 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165661 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-oauth-serving-cert\") pod \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\" (UID: \"a7c6c9c5-9b0b-4d52-90db-f8193effb884\") "
Apr 20 11:43:53.166040 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.165921 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:43:53.166040 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.166007 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-service-ca" (OuterVolumeSpecName: "service-ca") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:43:53.166150 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.166030 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-config" (OuterVolumeSpecName: "console-config") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:43:53.166150 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.166035 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:43:53.167958 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.167921 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:43:53.167958 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.167943 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:43:53.168098 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.167975 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c6c9c5-9b0b-4d52-90db-f8193effb884-kube-api-access-p6fs4" (OuterVolumeSpecName: "kube-api-access-p6fs4") pod "a7c6c9c5-9b0b-4d52-90db-f8193effb884" (UID: "a7c6c9c5-9b0b-4d52-90db-f8193effb884"). InnerVolumeSpecName "kube-api-access-p6fs4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:43:53.267009 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.266969 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-serving-cert\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.267009 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.267001 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-trusted-ca-bundle\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.267009 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.267012 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-oauth-config\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.267009 
ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.267021 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6fs4\" (UniqueName: \"kubernetes.io/projected/a7c6c9c5-9b0b-4d52-90db-f8193effb884-kube-api-access-p6fs4\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.267305 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.267032 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-service-ca\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.267305 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.267041 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-console-config\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.267305 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.267049 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7c6c9c5-9b0b-4d52-90db-f8193effb884-oauth-serving-cert\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:43:53.896205 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.896176 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c5cb569d4-mn8ck_a7c6c9c5-9b0b-4d52-90db-f8193effb884/console/0.log" Apr 20 11:43:53.896615 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.896220 2578 generic.go:358] "Generic (PLEG): container finished" podID="a7c6c9c5-9b0b-4d52-90db-f8193effb884" containerID="42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b" exitCode=2 Apr 20 11:43:53.896615 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.896277 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c5cb569d4-mn8ck" 
event={"ID":"a7c6c9c5-9b0b-4d52-90db-f8193effb884","Type":"ContainerDied","Data":"42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b"} Apr 20 11:43:53.896615 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.896320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c5cb569d4-mn8ck" event={"ID":"a7c6c9c5-9b0b-4d52-90db-f8193effb884","Type":"ContainerDied","Data":"d0b602c8e48c708d85049bb34d555ed14ffe86fcb4bdae243fe08f72c073c694"} Apr 20 11:43:53.896615 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.896319 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c5cb569d4-mn8ck" Apr 20 11:43:53.896615 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.896397 2578 scope.go:117] "RemoveContainer" containerID="42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b" Apr 20 11:43:53.911987 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.911967 2578 scope.go:117] "RemoveContainer" containerID="42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b" Apr 20 11:43:53.912308 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:43:53.912284 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b\": container with ID starting with 42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b not found: ID does not exist" containerID="42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b" Apr 20 11:43:53.912409 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.912314 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b"} err="failed to get container status \"42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b\": rpc error: code = NotFound desc = could not find container 
\"42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b\": container with ID starting with 42795f020ea4ee37107c3880e4b7b819bbf3189134cb1258af1f8c687772b17b not found: ID does not exist" Apr 20 11:43:53.918622 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.918598 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c5cb569d4-mn8ck"] Apr 20 11:43:53.924549 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:53.924525 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c5cb569d4-mn8ck"] Apr 20 11:43:54.335865 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:43:54.335824 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c6c9c5-9b0b-4d52-90db-f8193effb884" path="/var/lib/kubelet/pods/a7c6c9c5-9b0b-4d52-90db-f8193effb884/volumes" Apr 20 11:44:21.242687 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:21.242645 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:21.259115 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:21.259088 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:21.998536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:21.998505 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:34.972135 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972096 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 11:44:34.973085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972665 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="alertmanager" containerID="cri-o://8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925" 
gracePeriod=120 Apr 20 11:44:34.973085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972713 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-metric" containerID="cri-o://cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af" gracePeriod=120 Apr 20 11:44:34.973085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972815 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="config-reloader" containerID="cri-o://9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f" gracePeriod=120 Apr 20 11:44:34.973085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972804 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-web" containerID="cri-o://10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a" gracePeriod=120 Apr 20 11:44:34.973085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972929 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy" containerID="cri-o://5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0" gracePeriod=120 Apr 20 11:44:34.973085 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:34.972979 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="prom-label-proxy" containerID="cri-o://844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c" gracePeriod=120 Apr 20 11:44:36.025532 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:44:36.025495 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c" exitCode=0 Apr 20 11:44:36.025532 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025523 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0" exitCode=0 Apr 20 11:44:36.025532 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025529 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a" exitCode=0 Apr 20 11:44:36.025532 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025535 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f" exitCode=0 Apr 20 11:44:36.025532 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025540 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925" exitCode=0 Apr 20 11:44:36.026022 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c"} Apr 20 11:44:36.026022 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0"} Apr 20 11:44:36.026022 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a"} Apr 20 11:44:36.026022 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f"} Apr 20 11:44:36.026022 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.025632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925"} Apr 20 11:44:36.215981 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.215957 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:36.329316 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329209 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-web\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329316 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329297 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-main-db\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329322 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82ld4\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-kube-api-access-82ld4\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329350 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-trusted-ca-bundle\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329368 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-metrics-client-ca\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" 
(UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329399 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-config-volume\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329488 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-web-config\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329550 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-tls-assets\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329576 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-config-out\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.329632 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329617 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329658 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329712 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-cluster-tls-config\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329706 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329746 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy\") pod \"49b3596d-08f0-4661-9c8c-dc64d845513f\" (UID: \"49b3596d-08f0-4661-9c8c-dc64d845513f\") " Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329806 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.329810 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.330038 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-main-db\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.330067 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:36.330169 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.330083 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/49b3596d-08f0-4661-9c8c-dc64d845513f-metrics-client-ca\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:36.332671 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.332642 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-config-out" (OuterVolumeSpecName: "config-out") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 11:44:36.332960 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.332926 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:36.333125 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.333063 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:36.333640 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.333608 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-config-volume" (OuterVolumeSpecName: "config-volume") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:36.334269 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.333907 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:44:36.334475 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.334436 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:44:36.334475 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.334468 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:44:36.335159 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.335128 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-kube-api-access-82ld4" (OuterVolumeSpecName: "kube-api-access-82ld4") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "kube-api-access-82ld4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 11:44:36.337978 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.337946 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:44:36.344999 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.344979 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-web-config" (OuterVolumeSpecName: "web-config") pod "49b3596d-08f0-4661-9c8c-dc64d845513f" (UID: "49b3596d-08f0-4661-9c8c-dc64d845513f"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 11:44:36.431017 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.430975 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82ld4\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-kube-api-access-82ld4\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431017 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431009 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-config-volume\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431017 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431020 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-web-config\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431017 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431028 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/49b3596d-08f0-4661-9c8c-dc64d845513f-tls-assets\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431310 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431036 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/49b3596d-08f0-4661-9c8c-dc64d845513f-config-out\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431310 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431045 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-main-tls\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431310 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431055 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431310 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431064 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-cluster-tls-config\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431310 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431074 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:36.431310 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:36.431084 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/49b3596d-08f0-4661-9c8c-dc64d845513f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\""
Apr 20 11:44:37.031636 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.031597 2578 generic.go:358] "Generic (PLEG): container finished" podID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerID="cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af" exitCode=0
Apr 20 11:44:37.032046 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.031686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af"}
Apr 20 11:44:37.032046 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.031735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"49b3596d-08f0-4661-9c8c-dc64d845513f","Type":"ContainerDied","Data":"a1cc807cd5f22c4f90e185de6cf9fae4d36496f277894adc47024a397a4e9a4a"}
Apr 20 11:44:37.032046 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.031757 2578 scope.go:117] "RemoveContainer" containerID="844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c"
Apr 20 11:44:37.032046 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.031759 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.039970 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.039949 2578 scope.go:117] "RemoveContainer" containerID="cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af"
Apr 20 11:44:37.047305 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.047288 2578 scope.go:117] "RemoveContainer" containerID="5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0"
Apr 20 11:44:37.053847 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.053829 2578 scope.go:117] "RemoveContainer" containerID="10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a"
Apr 20 11:44:37.061034 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.061017 2578 scope.go:117] "RemoveContainer" containerID="9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f"
Apr 20 11:44:37.062437 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.062411 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:44:37.065978 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.065953 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:44:37.068709 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.068687 2578 scope.go:117] "RemoveContainer" containerID="8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925"
Apr 20 11:44:37.075320 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.075297 2578 scope.go:117] "RemoveContainer" containerID="ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad"
Apr 20 11:44:37.081799 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.081781 2578 scope.go:117] "RemoveContainer" containerID="844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c"
Apr 20 11:44:37.082060 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.082030 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c\": container with ID starting with 844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c not found: ID does not exist" containerID="844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c"
Apr 20 11:44:37.082118 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.082068 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c"} err="failed to get container status \"844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c\": rpc error: code = NotFound desc = could not find container \"844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c\": container with ID starting with 844e466cb95e5304fbc00f3af72267376009ffd5b48254c7dbc0dd3dbb86da2c not found: ID does not exist"
Apr 20 11:44:37.082118 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.082091 2578 scope.go:117] "RemoveContainer" containerID="cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af"
Apr 20 11:44:37.082360 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.082340 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af\": container with ID starting with cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af not found: ID does not exist" containerID="cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af"
Apr 20 11:44:37.082471 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.082367 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af"} err="failed to get container status \"cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af\": rpc error: code = NotFound desc = could not find container \"cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af\": container with ID starting with cf16ae7b88be692686bdf61c421eead531ba22efac8a7bb6f84fc6159b5738af not found: ID does not exist"
Apr 20 11:44:37.082471 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.082385 2578 scope.go:117] "RemoveContainer" containerID="5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0"
Apr 20 11:44:37.082656 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.082637 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0\": container with ID starting with 5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0 not found: ID does not exist" containerID="5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0"
Apr 20 11:44:37.082704 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.082662 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0"} err="failed to get container status \"5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0\": rpc error: code = NotFound desc = could not find container \"5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0\": container with ID starting with 5d17a13c65a70da299913f2220b45cce8d7344060d5224a92e17c03ef7f8eeb0 not found: ID does not exist"
Apr 20 11:44:37.082704 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.082678 2578 scope.go:117] "RemoveContainer" containerID="10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a"
Apr 20 11:44:37.082989 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.082963 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a\": container with ID starting with 10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a not found: ID does not exist" containerID="10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a"
Apr 20 11:44:37.083076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083014 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a"} err="failed to get container status \"10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a\": rpc error: code = NotFound desc = could not find container \"10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a\": container with ID starting with 10eb97ae90e2df22856d3349f182bcdd51179b245e52527215a2358414e69a8a not found: ID does not exist"
Apr 20 11:44:37.083076 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083038 2578 scope.go:117] "RemoveContainer" containerID="9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f"
Apr 20 11:44:37.083363 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.083327 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f\": container with ID starting with 9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f not found: ID does not exist" containerID="9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f"
Apr 20 11:44:37.083449 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083363 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f"} err="failed to get container status \"9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f\": rpc error: code = NotFound desc = could not find container \"9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f\": container with ID starting with 9a12484cf4c6b6f23d7671de4215eae2ecc43c0187e453877372b75a70755b6f not found: ID does not exist"
Apr 20 11:44:37.083449 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083385 2578 scope.go:117] "RemoveContainer" containerID="8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925"
Apr 20 11:44:37.083650 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.083625 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925\": container with ID starting with 8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925 not found: ID does not exist" containerID="8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925"
Apr 20 11:44:37.083740 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083658 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925"} err="failed to get container status \"8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925\": rpc error: code = NotFound desc = could not find container \"8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925\": container with ID starting with 8cfc52f6484c32b38bafd34546fae0379ec7e1d0a506d46bdb94765fc7216925 not found: ID does not exist"
Apr 20 11:44:37.083740 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083679 2578 scope.go:117] "RemoveContainer" containerID="ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad"
Apr 20 11:44:37.083950 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:37.083934 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad\": container with ID starting with ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad not found: ID does not exist" containerID="ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad"
Apr 20 11:44:37.083992 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.083954 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad"} err="failed to get container status \"ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad\": rpc error: code = NotFound desc = could not find container \"ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad\": container with ID starting with ace7437f90ccb45d785548cc49649e6bfd73c697c3c9f05f37bc661f4059dbad not found: ID does not exist"
Apr 20 11:44:37.095872 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.095844 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:44:37.096144 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096133 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="config-reloader"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096146 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="config-reloader"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096154 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-web"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096171 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-web"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096183 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="prom-label-proxy"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096188 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="prom-label-proxy"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096196 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy"
Apr 20 11:44:37.096199 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096202 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096212 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="alertmanager"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096218 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="alertmanager"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096225 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="init-config-reloader"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096230 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="init-config-reloader"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096236 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a7c6c9c5-9b0b-4d52-90db-f8193effb884" containerName="console"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096270 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c6c9c5-9b0b-4d52-90db-f8193effb884" containerName="console"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096275 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-metric"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096282 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-metric"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096326 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-web"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096333 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="config-reloader"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096339 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="prom-label-proxy"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096348 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096354 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="alertmanager"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096361 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7c6c9c5-9b0b-4d52-90db-f8193effb884" containerName="console"
Apr 20 11:44:37.096436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.096368 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" containerName="kube-rbac-proxy-metric"
Apr 20 11:44:37.100642 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.100620 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.103633 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.103612 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 11:44:37.103791 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.103775 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kt6q5\""
Apr 20 11:44:37.104035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.103932 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 11:44:37.104035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.103965 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 11:44:37.104035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.103994 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 11:44:37.104035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.103977 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 11:44:37.104035 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.104006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 11:44:37.104359 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.104006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 11:44:37.104422 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.104407 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 11:44:37.109741 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.109722 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 11:44:37.117405 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.117379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 11:44:37.238561 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238561 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238566 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238784 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238633 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919a1e65-0acf-4fa8-b70b-4a23c2897b98-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238784 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238686 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwp6v\" (UniqueName: \"kubernetes.io/projected/919a1e65-0acf-4fa8-b70b-4a23c2897b98-kube-api-access-xwp6v\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238784 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/919a1e65-0acf-4fa8-b70b-4a23c2897b98-tls-assets\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238784 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-config-volume\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-web-config\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/919a1e65-0acf-4fa8-b70b-4a23c2897b98-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/919a1e65-0acf-4fa8-b70b-4a23c2897b98-config-out\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.238935 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.239096 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.239096 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.238962 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/919a1e65-0acf-4fa8-b70b-4a23c2897b98-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340096 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.339989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/919a1e65-0acf-4fa8-b70b-4a23c2897b98-config-out\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340096 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340096 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340096 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/919a1e65-0acf-4fa8-b70b-4a23c2897b98-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340128 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919a1e65-0acf-4fa8-b70b-4a23c2897b98-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwp6v\" (UniqueName: \"kubernetes.io/projected/919a1e65-0acf-4fa8-b70b-4a23c2897b98-kube-api-access-xwp6v\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/919a1e65-0acf-4fa8-b70b-4a23c2897b98-tls-assets\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-config-volume\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-web-config\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/919a1e65-0acf-4fa8-b70b-4a23c2897b98-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.340948 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.340767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/919a1e65-0acf-4fa8-b70b-4a23c2897b98-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.341154 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.341099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/919a1e65-0acf-4fa8-b70b-4a23c2897b98-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.342188 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.342126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/919a1e65-0acf-4fa8-b70b-4a23c2897b98-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.343345 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.343303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.343776 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.343749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.343931 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.343882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.344031 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.343951 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/919a1e65-0acf-4fa8-b70b-4a23c2897b98-config-out\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.344267 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.344228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/919a1e65-0acf-4fa8-b70b-4a23c2897b98-tls-assets\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 11:44:37.344363 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.344332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:37.344526 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.344509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-web-config\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:37.344959 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.344941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-config-volume\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:37.345429 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.345409 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/919a1e65-0acf-4fa8-b70b-4a23c2897b98-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:37.349954 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.349930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwp6v\" (UniqueName: \"kubernetes.io/projected/919a1e65-0acf-4fa8-b70b-4a23c2897b98-kube-api-access-xwp6v\") pod \"alertmanager-main-0\" (UID: \"919a1e65-0acf-4fa8-b70b-4a23c2897b98\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:37.410911 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.410871 2578 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 11:44:37.545190 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:37.545162 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 11:44:37.546994 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:44:37.546958 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919a1e65_0acf_4fa8_b70b_4a23c2897b98.slice/crio-22cb08a6596999f9bebbffc68c958c93297c5d9ead6c0e8558c1e0b64a180c86 WatchSource:0}: Error finding container 22cb08a6596999f9bebbffc68c958c93297c5d9ead6c0e8558c1e0b64a180c86: Status 404 returned error can't find the container with id 22cb08a6596999f9bebbffc68c958c93297c5d9ead6c0e8558c1e0b64a180c86 Apr 20 11:44:38.036986 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:38.036947 2578 generic.go:358] "Generic (PLEG): container finished" podID="919a1e65-0acf-4fa8-b70b-4a23c2897b98" containerID="da4a69885a80598633874f1333cc660733a3c62efadabb24dbf88eb659ca118f" exitCode=0 Apr 20 11:44:38.037361 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:38.037010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerDied","Data":"da4a69885a80598633874f1333cc660733a3c62efadabb24dbf88eb659ca118f"} Apr 20 11:44:38.037361 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:38.037038 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"22cb08a6596999f9bebbffc68c958c93297c5d9ead6c0e8558c1e0b64a180c86"} Apr 20 11:44:38.336339 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:38.336305 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b3596d-08f0-4661-9c8c-dc64d845513f" 
path="/var/lib/kubelet/pods/49b3596d-08f0-4661-9c8c-dc64d845513f/volumes" Apr 20 11:44:39.045210 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.045172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"8a371a66825c513ea4e9411796798bbcb25b420ce72dc7ffe2ae7810015a8993"} Apr 20 11:44:39.045210 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.045210 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"e88322da997aa704947ca8798e1aa6c4133bc53a27c53ed2903186dc530cd826"} Apr 20 11:44:39.045617 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.045219 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"a47dd4e0477e3a880cd914f78adce86e4f97f4485bb3051829f71e3c6dea7b08"} Apr 20 11:44:39.045617 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.045228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"5fcecbdc00e4ce58b3340a077e90cab9c56d79d4440496914ce418ae276d90e7"} Apr 20 11:44:39.045617 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.045236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"b0f0eb97c1eea862068df78a794b171720203df972a2cee01785021e0e713e0c"} Apr 20 11:44:39.045617 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.045262 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"919a1e65-0acf-4fa8-b70b-4a23c2897b98","Type":"ContainerStarted","Data":"203a0e3ed4e6fcaeb2b958ef64ae604db42ee12d08497af1ea9355741bd988a5"} Apr 20 11:44:39.074200 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.074130 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.074110523 podStartE2EDuration="2.074110523s" podCreationTimestamp="2026-04-20 11:44:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:44:39.071985805 +0000 UTC m=+165.195405052" watchObservedRunningTime="2026-04-20 11:44:39.074110523 +0000 UTC m=+165.197529771" Apr 20 11:44:39.195610 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.195573 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:44:39.196438 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.196377 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="prometheus" containerID="cri-o://bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30" gracePeriod=600 Apr 20 11:44:39.196674 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.196403 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-web" containerID="cri-o://2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" gracePeriod=600 Apr 20 11:44:39.196800 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.196411 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-thanos" 
containerID="cri-o://56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" gracePeriod=600 Apr 20 11:44:39.196862 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.196438 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="config-reloader" containerID="cri-o://a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" gracePeriod=600 Apr 20 11:44:39.196862 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.196440 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="thanos-sidecar" containerID="cri-o://a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" gracePeriod=600 Apr 20 11:44:39.196948 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.196435 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy" containerID="cri-o://071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" gracePeriod=600 Apr 20 11:44:39.450551 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.450526 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:39.563490 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563401 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-grpc-tls\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563490 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563452 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-metrics-client-ca\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563490 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563473 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config-out\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563497 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563521 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-tls\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: 
\"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563545 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-thanos-prometheus-http-client-file\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563576 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-web-config\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563619 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-tls-assets\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563650 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-db\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563676 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-kubelet-serving-ca-bundle\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 
ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563705 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-rulefiles-0\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.563737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563738 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563773 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-metrics-client-certs\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563810 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxqt\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-kube-api-access-9sxqt\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563852 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:44:39.563890 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-kube-rbac-proxy\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563893 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563938 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-serving-certs-ca-bundle\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564165 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.563989 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-trusted-ca-bundle\") pod \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\" (UID: \"232fa66a-5df6-4947-8ac8-51c68ea12a9c\") " Apr 20 11:44:39.564718 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.564468 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-metrics-client-ca\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.565697 
ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.564961 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:39.565697 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.565142 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:39.565697 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.565614 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:39.566006 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.565910 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 11:44:39.566548 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.566481 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.567161 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.567122 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 11:44:39.567291 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.567214 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.567555 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.567524 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.567637 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.567563 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config-out" (OuterVolumeSpecName: "config-out") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 11:44:39.567845 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.567802 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.568215 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.568192 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config" (OuterVolumeSpecName: "config") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.568750 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.568727 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:44:39.569099 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.569071 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-kube-api-access-9sxqt" (OuterVolumeSpecName: "kube-api-access-9sxqt") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "kube-api-access-9sxqt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 11:44:39.569175 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.569129 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.569175 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.569141 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.569274 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.569261 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.579294 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.579270 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-web-config" (OuterVolumeSpecName: "web-config") pod "232fa66a-5df6-4947-8ac8-51c68ea12a9c" (UID: "232fa66a-5df6-4947-8ac8-51c68ea12a9c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 11:44:39.665468 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665428 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-db\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665468 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665464 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665468 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665476 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665486 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665496 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-metrics-client-certs\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665504 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9sxqt\" (UniqueName: \"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-kube-api-access-9sxqt\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665515 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665525 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-kube-rbac-proxy\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665534 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665543 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/232fa66a-5df6-4947-8ac8-51c68ea12a9c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665551 2578 reconciler_common.go:299] "Volume detached for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-grpc-tls\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665560 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/232fa66a-5df6-4947-8ac8-51c68ea12a9c-config-out\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665569 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665579 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665587 2578 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665596 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/232fa66a-5df6-4947-8ac8-51c68ea12a9c-web-config\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:39.665692 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:39.665605 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/232fa66a-5df6-4947-8ac8-51c68ea12a9c-tls-assets\") on node \"ip-10-0-133-125.ec2.internal\" DevicePath \"\"" Apr 20 11:44:40.051894 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051861 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" exitCode=0 Apr 20 11:44:40.051894 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051891 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" exitCode=0 Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051901 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" exitCode=0 Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051912 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" exitCode=0 Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051920 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" exitCode=0 Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051927 2578 generic.go:358] "Generic (PLEG): container finished" podID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30" exitCode=0 Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051990 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.052002 2578 scope.go:117] "RemoveContainer" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.051989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.052134 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.052153 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.052177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.052192 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} Apr 20 11:44:40.052341 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.052207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"232fa66a-5df6-4947-8ac8-51c68ea12a9c","Type":"ContainerDied","Data":"e6f04c6534887e9403bca37cc0af0f286dec76b22277802c651b235d2d1f96ba"} Apr 20 11:44:40.060496 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.060480 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" Apr 20 11:44:40.067737 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.067720 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" Apr 20 11:44:40.078931 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.076995 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" Apr 20 11:44:40.079588 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.079557 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:44:40.082747 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.082720 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:44:40.091352 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.091329 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" Apr 20 11:44:40.098373 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.098334 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30" Apr 20 11:44:40.105436 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.105417 2578 scope.go:117] 
"RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37" Apr 20 11:44:40.109668 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.109643 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:44:40.110112 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110079 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-web" Apr 20 11:44:40.110112 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110100 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-web" Apr 20 11:44:40.110112 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110115 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="init-config-reloader" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110124 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="init-config-reloader" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110135 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-thanos" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110143 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-thanos" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110161 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="prometheus" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110169 2578 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="prometheus" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110183 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110191 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110213 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="thanos-sidecar" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110221 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="thanos-sidecar" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110234 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="config-reloader" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110260 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="config-reloader" Apr 20 11:44:40.110334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110327 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-web" Apr 20 11:44:40.110652 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110343 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="thanos-sidecar" Apr 20 11:44:40.110652 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110353 2578 
memory_manager.go:356] "RemoveStaleState removing state" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy-thanos" Apr 20 11:44:40.110652 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110363 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="kube-rbac-proxy" Apr 20 11:44:40.110652 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110375 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="config-reloader" Apr 20 11:44:40.110652 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.110385 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" containerName="prometheus" Apr 20 11:44:40.113053 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113037 2578 scope.go:117] "RemoveContainer" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" Apr 20 11:44:40.113333 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.113312 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" Apr 20 11:44:40.113378 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113343 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} err="failed to get container status \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting 
with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist" Apr 20 11:44:40.113378 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113361 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" Apr 20 11:44:40.113597 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.113576 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" Apr 20 11:44:40.113661 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113625 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} err="failed to get container status \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist" Apr 20 11:44:40.113661 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113651 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" Apr 20 11:44:40.113906 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.113891 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist" 
containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" Apr 20 11:44:40.113949 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113909 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} err="failed to get container status \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist" Apr 20 11:44:40.113949 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.113922 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" Apr 20 11:44:40.114142 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.114125 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" Apr 20 11:44:40.114184 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.114147 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} err="failed to get container status \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist" Apr 20 
11:44:40.114184 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.114162 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" Apr 20 11:44:40.114444 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.114424 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" Apr 20 11:44:40.114514 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.114446 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} err="failed to get container status \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": rpc error: code = NotFound desc = could not find container \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist" Apr 20 11:44:40.114514 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.114462 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30" Apr 20 11:44:40.114698 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.114681 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30" Apr 20 11:44:40.114767 
ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.114705 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} err="failed to get container status \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist" Apr 20 11:44:40.114767 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.114725 2578 scope.go:117] "RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37" Apr 20 11:44:40.115028 ip-10-0-133-125 kubenswrapper[2578]: E0420 11:44:40.114997 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37" Apr 20 11:44:40.115110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115034 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"} err="failed to get container status \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist" Apr 20 11:44:40.115110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115055 2578 scope.go:117] 
"RemoveContainer" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" Apr 20 11:44:40.115110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115105 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.115566 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115326 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} err="failed to get container status \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist" Apr 20 11:44:40.115566 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115351 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" Apr 20 11:44:40.115872 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115814 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} err="failed to get container status \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist" Apr 20 11:44:40.115872 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.115833 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" Apr 20 11:44:40.116095 ip-10-0-133-125 kubenswrapper[2578]: I0420 
11:44:40.116076 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} err="failed to get container status \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist" Apr 20 11:44:40.116161 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116094 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" Apr 20 11:44:40.116435 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116412 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} err="failed to get container status \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist" Apr 20 11:44:40.116528 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116436 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" Apr 20 11:44:40.116676 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116657 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} err="failed to get container status \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": rpc error: code = NotFound desc = could not find container 
\"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist" Apr 20 11:44:40.116720 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116678 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30" Apr 20 11:44:40.116893 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116870 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} err="failed to get container status \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist" Apr 20 11:44:40.116932 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.116896 2578 scope.go:117] "RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37" Apr 20 11:44:40.117101 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117083 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"} err="failed to get container status \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist" Apr 20 11:44:40.117171 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117103 2578 scope.go:117] "RemoveContainer" 
containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf" Apr 20 11:44:40.117347 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117327 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} err="failed to get container status \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist" Apr 20 11:44:40.117396 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117348 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea" Apr 20 11:44:40.117568 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117544 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} err="failed to get container status \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist" Apr 20 11:44:40.117616 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117571 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c" Apr 20 11:44:40.117802 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117779 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} err="failed to get container status 
\"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist" Apr 20 11:44:40.117802 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.117802 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4" Apr 20 11:44:40.118095 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118070 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} err="failed to get container status \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist" Apr 20 11:44:40.118150 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118098 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116" Apr 20 11:44:40.118394 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118363 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} err="failed to get container status \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": rpc error: code = NotFound desc = could not find container \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist" Apr 20 11:44:40.118486 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:44:40.118428 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"
Apr 20 11:44:40.118486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118446 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-2hwpv\""
Apr 20 11:44:40.118486 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118376 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 11:44:40.118634 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118377 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-548tscpr96qlm\""
Apr 20 11:44:40.118634 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118379 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 11:44:40.118634 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118568 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 11:44:40.118769 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118634 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 11:44:40.118870 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118845 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} err="failed to get container status \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist"
Apr 20 11:44:40.118870 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118870 2578 scope.go:117] "RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"
Apr 20 11:44:40.118990 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118974 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 11:44:40.119044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.118994 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 11:44:40.119113 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119092 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"} err="failed to get container status \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist"
Apr 20 11:44:40.119174 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119116 2578 scope.go:117] "RemoveContainer" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"
Apr 20 11:44:40.119439 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119411 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} err="failed to get container status \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist"
Apr 20 11:44:40.119653 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119441 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"
Apr 20 11:44:40.119653 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119517 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 11:44:40.119764 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119689 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} err="failed to get container status \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist"
Apr 20 11:44:40.119764 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119708 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"
Apr 20 11:44:40.119867 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.119830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 11:44:40.120100 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120082 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 11:44:40.120231 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120128 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} err="failed to get container status \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist"
Apr 20 11:44:40.120364 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120236 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"
Apr 20 11:44:40.120364 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120175 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 11:44:40.120562 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120538 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} err="failed to get container status \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist"
Apr 20 11:44:40.120562 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120561 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"
Apr 20 11:44:40.120862 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120836 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} err="failed to get container status \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": rpc error: code = NotFound desc = could not find container \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist"
Apr 20 11:44:40.120922 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120867 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"
Apr 20 11:44:40.120922 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.120848 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 11:44:40.121227 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.121206 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} err="failed to get container status \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist"
Apr 20 11:44:40.121329 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.121229 2578 scope.go:117] "RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"
Apr 20 11:44:40.121633 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.121607 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"} err="failed to get container status \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist"
Apr 20 11:44:40.121633 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.121630 2578 scope.go:117] "RemoveContainer" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"
Apr 20 11:44:40.121887 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.121859 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} err="failed to get container status \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist"
Apr 20 11:44:40.121964 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.121889 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"
Apr 20 11:44:40.122198 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122169 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} err="failed to get container status \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist"
Apr 20 11:44:40.122198 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122191 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"
Apr 20 11:44:40.122374 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122264 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 11:44:40.122466 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122444 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} err="failed to get container status \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist"
Apr 20 11:44:40.122522 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122469 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"
Apr 20 11:44:40.122753 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122717 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} err="failed to get container status \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist"
Apr 20 11:44:40.122816 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122755 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"
Apr 20 11:44:40.123003 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.122977 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} err="failed to get container status \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": rpc error: code = NotFound desc = could not find container \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist"
Apr 20 11:44:40.123099 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123004 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"
Apr 20 11:44:40.123269 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123222 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} err="failed to get container status \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist"
Apr 20 11:44:40.123334 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123271 2578 scope.go:117] "RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"
Apr 20 11:44:40.123535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123515 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"} err="failed to get container status \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist"
Apr 20 11:44:40.123592 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123536 2578 scope.go:117] "RemoveContainer" containerID="56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"
Apr 20 11:44:40.123814 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123788 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf"} err="failed to get container status \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": rpc error: code = NotFound desc = could not find container \"56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf\": container with ID starting with 56cae67d7a570aba9f961b97ead7ad3f7b6db5e946f66af58afdd84259552faf not found: ID does not exist"
Apr 20 11:44:40.123887 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.123816 2578 scope.go:117] "RemoveContainer" containerID="071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"
Apr 20 11:44:40.124145 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124121 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea"} err="failed to get container status \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": rpc error: code = NotFound desc = could not find container \"071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea\": container with ID starting with 071e7ba5b562f3a918d236786d5b021ed38ad6df6c26ef64d8d77d0e6e3beaea not found: ID does not exist"
Apr 20 11:44:40.124145 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124146 2578 scope.go:117] "RemoveContainer" containerID="2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"
Apr 20 11:44:40.124430 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124406 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c"} err="failed to get container status \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": rpc error: code = NotFound desc = could not find container \"2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c\": container with ID starting with 2f454bbb5641d4c5fc3dd84e1842608f95811167170fe58d3a5cff300c11495c not found: ID does not exist"
Apr 20 11:44:40.124492 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124432 2578 scope.go:117] "RemoveContainer" containerID="a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"
Apr 20 11:44:40.124668 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124644 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4"} err="failed to get container status \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": rpc error: code = NotFound desc = could not find container \"a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4\": container with ID starting with a490067a8921631bec7286ff6f70240a9b87b80b310b9d77b8f31b8662d4fce4 not found: ID does not exist"
Apr 20 11:44:40.124668 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124666 2578 scope.go:117] "RemoveContainer" containerID="a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"
Apr 20 11:44:40.125236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124938 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116"} err="failed to get container status \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": rpc error: code = NotFound desc = could not find container \"a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116\": container with ID starting with a36b6464bfce87d7123fe55ea60501e356bd7bfe18fe369e180d8e1b1b1f7116 not found: ID does not exist"
Apr 20 11:44:40.125236 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.124965 2578 scope.go:117] "RemoveContainer" containerID="bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"
Apr 20 11:44:40.125496 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.125279 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30"} err="failed to get container status \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": rpc error: code = NotFound desc = could not find container \"bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30\": container with ID starting with bcf710fef72dd9b37f197e4fc3aab0e7f23509ff41deb673d42a5a7a26366e30 not found: ID does not exist"
Apr 20 11:44:40.125496 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.125304 2578 scope.go:117] "RemoveContainer" containerID="b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"
Apr 20 11:44:40.125794 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.125593 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37"} err="failed to get container status \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": rpc error: code = NotFound desc = could not find container \"b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37\": container with ID starting with b9212ce117d3d95ebb27ba9ec8d539be0ec6c607c80a719906d074a3a90e6b37 not found: ID does not exist"
Apr 20 11:44:40.127915 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.127892 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 11:44:40.129589 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.129569 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 11:44:40.170559 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.170559 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170560 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.170804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.170804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170684 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.170804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.170804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.170804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170764 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171063 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvrr\" (UniqueName: \"kubernetes.io/projected/889c7ab2-1391-40cc-8abf-56494618095b-kube-api-access-mgvrr\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171063 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.170901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/889c7ab2-1391-40cc-8abf-56494618095b-config-out\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171063 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/889c7ab2-1391-40cc-8abf-56494618095b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171063 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171210 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171276 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171338 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-config\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171392 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171351 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171441 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171441 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.171536 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.171442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-web-config\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272363 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272571 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272571 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272680 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272680 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272598 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272680 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272615 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272680 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvrr\" (UniqueName: \"kubernetes.io/projected/889c7ab2-1391-40cc-8abf-56494618095b-kube-api-access-mgvrr\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272731 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/889c7ab2-1391-40cc-8abf-56494618095b-config-out\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/889c7ab2-1391-40cc-8abf-56494618095b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272793 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.272873 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-config\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272942 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.272984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273153 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.273010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273435 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.273266 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273435 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.273411 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.273707 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.273682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.275822 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.275794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.275919 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.275794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.276530 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.276276 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.276530 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.276412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-config\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.276530 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.276487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.276729 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.276536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-web-config\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.276729 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.276548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 11:44:40.277390 ip-10-0-133-125 kubenswrapper[2578]: I0420
11:44:40.276888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.277745 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.277715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.278074 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.278054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/889c7ab2-1391-40cc-8abf-56494618095b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.278523 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.278498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/889c7ab2-1391-40cc-8abf-56494618095b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.278746 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.278725 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.278902 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:44:40.278879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/889c7ab2-1391-40cc-8abf-56494618095b-config-out\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.279110 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.279093 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/889c7ab2-1391-40cc-8abf-56494618095b-web-config\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.290062 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.290038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvrr\" (UniqueName: \"kubernetes.io/projected/889c7ab2-1391-40cc-8abf-56494618095b-kube-api-access-mgvrr\") pod \"prometheus-k8s-0\" (UID: \"889c7ab2-1391-40cc-8abf-56494618095b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.335170 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.335094 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232fa66a-5df6-4947-8ac8-51c68ea12a9c" path="/var/lib/kubelet/pods/232fa66a-5df6-4947-8ac8-51c68ea12a9c/volumes" Apr 20 11:44:40.427697 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.427662 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:44:40.563838 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:40.563809 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 11:44:40.566216 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:44:40.566186 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889c7ab2_1391_40cc_8abf_56494618095b.slice/crio-d51d9a76951dd009288ddf3e53682de9b77b53e1ce26fd06c1c037ffd302ab61 WatchSource:0}: Error finding container d51d9a76951dd009288ddf3e53682de9b77b53e1ce26fd06c1c037ffd302ab61: Status 404 returned error can't find the container with id d51d9a76951dd009288ddf3e53682de9b77b53e1ce26fd06c1c037ffd302ab61 Apr 20 11:44:41.057592 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:41.057562 2578 generic.go:358] "Generic (PLEG): container finished" podID="889c7ab2-1391-40cc-8abf-56494618095b" containerID="7f37231e66053e3145d3b839f3bf782ee81a2c72faaaff8e24b70bbcf2671400" exitCode=0 Apr 20 11:44:41.058010 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:41.057648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerDied","Data":"7f37231e66053e3145d3b839f3bf782ee81a2c72faaaff8e24b70bbcf2671400"} Apr 20 11:44:41.058010 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:41.057690 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"d51d9a76951dd009288ddf3e53682de9b77b53e1ce26fd06c1c037ffd302ab61"} Apr 20 11:44:42.063752 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.063712 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"df164fd8663416f1a7f3b2b121fd1e502cc60cb7d32231c51fbd1795efccb5ef"} Apr 20 11:44:42.063752 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.063758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"04f946e3befbe89ded84c97f4e3027a4ff238eac78bbc3a167431fef18b1e602"} Apr 20 11:44:42.064367 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.063768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"aee81a4e63657484e66f69b1d4b1c5b2ea46bde1ff2c72fe9236d205d00df1c6"} Apr 20 11:44:42.064367 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.063778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"90ffaa980feda02e99bf1b5878a444597fd3c6b8084ca25f251f2763cad0998d"} Apr 20 11:44:42.064367 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.063787 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"6bce24355f0fc13fde031f542b8b57a09b10299f6773dee954a6bbea5821a754"} Apr 20 11:44:42.064367 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.063795 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"889c7ab2-1391-40cc-8abf-56494618095b","Type":"ContainerStarted","Data":"5945c8fa7c71ebddc770a36d729461ffb4227f223ec9a0580104dba76df82f42"} Apr 20 11:44:42.094871 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:42.094823 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.094809772 podStartE2EDuration="2.094809772s" podCreationTimestamp="2026-04-20 11:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:44:42.093023357 +0000 UTC m=+168.216442627" watchObservedRunningTime="2026-04-20 11:44:42.094809772 +0000 UTC m=+168.218229018" Apr 20 11:44:45.427970 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:44:45.427934 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:45:40.428131 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:45:40.428088 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:45:40.444303 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:45:40.444272 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:45:41.257669 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:45:41.257642 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 11:46:24.325280 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:24.325215 2578 ???:1] "http: TLS handshake error from 10.0.133.125:38608: EOF" Apr 20 11:46:24.326050 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:24.326030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vn8x4_bc1a0b4c-542c-4194-8902-65ea34abd811/global-pull-secret-syncer/0.log" Apr 20 11:46:24.394896 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:24.394868 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nxf6x_f7fd4abe-2e90-42e4-b4e1-6d43241cd39a/konnectivity-agent/0.log" Apr 20 11:46:24.440800 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:24.440766 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-125.ec2.internal_f84e17a26108ba2d3be8bac3d44320a4/haproxy/0.log" Apr 20 11:46:27.737823 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.737743 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/alertmanager/0.log" Apr 20 11:46:27.768893 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.768866 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/config-reloader/0.log" Apr 20 11:46:27.793091 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.793066 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/kube-rbac-proxy-web/0.log" Apr 20 11:46:27.828503 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.828477 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/kube-rbac-proxy/0.log" Apr 20 11:46:27.858064 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.858036 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/kube-rbac-proxy-metric/0.log" Apr 20 11:46:27.882386 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.882358 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/prom-label-proxy/0.log" Apr 20 11:46:27.903817 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:27.903782 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_919a1e65-0acf-4fa8-b70b-4a23c2897b98/init-config-reloader/0.log" Apr 20 11:46:28.272377 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.272348 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vfzd7_9a4a187b-9bc0-45c2-b2f3-2f91c2357f40/node-exporter/0.log" Apr 20 11:46:28.296955 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.296917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vfzd7_9a4a187b-9bc0-45c2-b2f3-2f91c2357f40/kube-rbac-proxy/0.log" Apr 20 11:46:28.318805 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.318783 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vfzd7_9a4a187b-9bc0-45c2-b2f3-2f91c2357f40/init-textfile/0.log" Apr 20 11:46:28.418215 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.418181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/prometheus/0.log" Apr 20 11:46:28.441313 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.441284 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/config-reloader/0.log" Apr 20 11:46:28.462059 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.462030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/thanos-sidecar/0.log" Apr 20 11:46:28.484666 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.484629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/kube-rbac-proxy-web/0.log" Apr 20 11:46:28.505573 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.505490 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/kube-rbac-proxy/0.log" Apr 20 11:46:28.525385 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.525355 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/kube-rbac-proxy-thanos/0.log" Apr 20 11:46:28.546694 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.546663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_889c7ab2-1391-40cc-8abf-56494618095b/init-config-reloader/0.log" Apr 20 11:46:28.578033 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.578007 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9rcnn_fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d/prometheus-operator/0.log" Apr 20 11:46:28.593539 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.593513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9rcnn_fb6784c3-659d-48bd-a0c2-ff3dcb04ee3d/kube-rbac-proxy/0.log" Apr 20 11:46:28.774611 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.774581 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6685d9887c-vsnfc_345aa0c5-2e04-44b5-800d-435f23ed96de/thanos-query/0.log" Apr 20 11:46:28.807361 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.807326 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6685d9887c-vsnfc_345aa0c5-2e04-44b5-800d-435f23ed96de/kube-rbac-proxy-web/0.log" Apr 20 11:46:28.832514 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.832482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6685d9887c-vsnfc_345aa0c5-2e04-44b5-800d-435f23ed96de/kube-rbac-proxy/0.log" Apr 20 11:46:28.853056 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.853029 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6685d9887c-vsnfc_345aa0c5-2e04-44b5-800d-435f23ed96de/prom-label-proxy/0.log" Apr 20 11:46:28.881422 ip-10-0-133-125 
kubenswrapper[2578]: I0420 11:46:28.881398 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6685d9887c-vsnfc_345aa0c5-2e04-44b5-800d-435f23ed96de/kube-rbac-proxy-rules/0.log" Apr 20 11:46:28.907675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:28.907646 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6685d9887c-vsnfc_345aa0c5-2e04-44b5-800d-435f23ed96de/kube-rbac-proxy-metrics/0.log" Apr 20 11:46:30.254508 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.254476 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4mhc4_7a58fa4b-ae1b-451e-ab5e-4aaf7713ebdd/console-operator/0.log" Apr 20 11:46:30.611774 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.611745 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk"] Apr 20 11:46:30.615119 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.615102 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.619464 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.619434 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9p767\"/\"openshift-service-ca.crt\"" Apr 20 11:46:30.619598 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.619505 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9p767\"/\"kube-root-ca.crt\"" Apr 20 11:46:30.620774 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.620753 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9p767\"/\"default-dockercfg-dcczl\"" Apr 20 11:46:30.629386 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.629365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk"] Apr 20 11:46:30.640311 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.640287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-lib-modules\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.640417 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.640349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-proc\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.640417 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.640376 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-podres\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.640417 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.640400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v85vs\" (UniqueName: \"kubernetes.io/projected/949d5c9e-0d48-4380-8c7a-adb49211781f-kube-api-access-v85vs\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.640515 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.640419 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-sys\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.664283 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.664223 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-m82f2_ea955ee0-5a1d-4e72-bca6-f985586b22e1/download-server/0.log" Apr 20 11:46:30.741348 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-lib-modules\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741508 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741402 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-proc\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741508 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741433 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-podres\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741508 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v85vs\" (UniqueName: \"kubernetes.io/projected/949d5c9e-0d48-4380-8c7a-adb49211781f-kube-api-access-v85vs\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741508 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-sys\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-lib-modules\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " 
pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-proc\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741566 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-podres\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.741675 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.741567 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/949d5c9e-0d48-4380-8c7a-adb49211781f-sys\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.751229 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.751202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v85vs\" (UniqueName: \"kubernetes.io/projected/949d5c9e-0d48-4380-8c7a-adb49211781f-kube-api-access-v85vs\") pod \"perf-node-gather-daemonset-hvvwk\" (UID: \"949d5c9e-0d48-4380-8c7a-adb49211781f\") " pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" Apr 20 11:46:30.924602 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:30.924494 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk"
Apr 20 11:46:31.046804 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.046779 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk"]
Apr 20 11:46:31.049403 ip-10-0-133-125 kubenswrapper[2578]: W0420 11:46:31.049344 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod949d5c9e_0d48_4380_8c7a_adb49211781f.slice/crio-e1ef29510ddd7b08bee17a8aa9f2ab3a6e91545e60b47e51a4e9791088affe9d WatchSource:0}: Error finding container e1ef29510ddd7b08bee17a8aa9f2ab3a6e91545e60b47e51a4e9791088affe9d: Status 404 returned error can't find the container with id e1ef29510ddd7b08bee17a8aa9f2ab3a6e91545e60b47e51a4e9791088affe9d
Apr 20 11:46:31.050005 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.049977 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7qrlg_904766d7-451c-4378-8dbf-8486dbbd70e6/volume-data-source-validator/0.log"
Apr 20 11:46:31.387072 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.387037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" event={"ID":"949d5c9e-0d48-4380-8c7a-adb49211781f","Type":"ContainerStarted","Data":"62bb441f4222d756b21ce947a49ef66b8b740275d385a6d065db3e3c3215f687"}
Apr 20 11:46:31.387485 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.387081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" event={"ID":"949d5c9e-0d48-4380-8c7a-adb49211781f","Type":"ContainerStarted","Data":"e1ef29510ddd7b08bee17a8aa9f2ab3a6e91545e60b47e51a4e9791088affe9d"}
Apr 20 11:46:31.387485 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.387166 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk"
Apr 20 11:46:31.408404 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.408360 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk" podStartSLOduration=1.408344555 podStartE2EDuration="1.408344555s" podCreationTimestamp="2026-04-20 11:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 11:46:31.406314872 +0000 UTC m=+277.529734119" watchObservedRunningTime="2026-04-20 11:46:31.408344555 +0000 UTC m=+277.531763876"
Apr 20 11:46:31.806877 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.806844 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lvlmv_53d403af-81c7-4e78-8fe5-d31a6f123b4b/dns/0.log"
Apr 20 11:46:31.831884 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.831854 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lvlmv_53d403af-81c7-4e78-8fe5-d31a6f123b4b/kube-rbac-proxy/0.log"
Apr 20 11:46:31.891665 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:31.891636 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fq5mw_9c5f9f57-fb9e-4b09-a20c-38f4bd5c5552/dns-node-resolver/0.log"
Apr 20 11:46:32.386545 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:32.386516 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ltzgg_8e78ca5b-c7fc-4c32-ae65-ccfc944fc66d/node-ca/0.log"
Apr 20 11:46:33.086747 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:33.086716 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-69794ff49d-n7xjz_192a48bd-bcb6-4fec-9fa3-cd24f83284be/router/0.log"
Apr 20 11:46:33.408299 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:33.408188 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8sqkc_e3fe8f50-9343-4a7e-8938-2d2334926942/serve-healthcheck-canary/0.log"
Apr 20 11:46:33.926540 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:33.926510 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zps6q_46495978-0693-4293-ac30-560a6e13e86c/kube-rbac-proxy/0.log"
Apr 20 11:46:33.951440 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:33.951410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zps6q_46495978-0693-4293-ac30-560a6e13e86c/exporter/0.log"
Apr 20 11:46:33.974280 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:33.974232 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zps6q_46495978-0693-4293-ac30-560a6e13e86c/extractor/0.log"
Apr 20 11:46:37.400075 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:37.400051 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9p767/perf-node-gather-daemonset-hvvwk"
Apr 20 11:46:37.831298 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:37.831266 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-khfpp_f7b624b2-8718-4b1f-9f76-0459cb6d4184/kube-storage-version-migrator-operator/0.log"
Apr 20 11:46:38.866419 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.866389 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/kube-multus-additional-cni-plugins/0.log"
Apr 20 11:46:38.890511 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.890483 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/egress-router-binary-copy/0.log"
Apr 20 11:46:38.914083 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.914056 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/cni-plugins/0.log"
Apr 20 11:46:38.933889 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.933863 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/bond-cni-plugin/0.log"
Apr 20 11:46:38.956069 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.956047 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/routeoverride-cni/0.log"
Apr 20 11:46:38.975654 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.975633 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/whereabouts-cni-bincopy/0.log"
Apr 20 11:46:38.994044 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:38.994020 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vg6zv_6276a0c3-e955-4691-b383-18751303b9e2/whereabouts-cni/0.log"
Apr 20 11:46:39.047897 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:39.047865 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s6chk_f6dd0225-09bd-4349-9632-48a466010b96/kube-multus/0.log"
Apr 20 11:46:39.105664 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:39.105636 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4lcnh_d9165296-57f0-4590-ad83-189871356a1a/network-metrics-daemon/0.log"
Apr 20 11:46:39.122535 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:39.122507 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4lcnh_d9165296-57f0-4590-ad83-189871356a1a/kube-rbac-proxy/0.log"
Apr 20 11:46:39.965363 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:39.965236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-controller/0.log"
Apr 20 11:46:39.982413 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:39.982385 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-acl-logging/0.log"
Apr 20 11:46:39.984508 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:39.984486 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovn-acl-logging/1.log"
Apr 20 11:46:40.002515 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:40.002490 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/kube-rbac-proxy-node/0.log"
Apr 20 11:46:40.020672 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:40.020641 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 11:46:40.037407 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:40.037381 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/northd/0.log"
Apr 20 11:46:40.057128 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:40.057093 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/nbdb/0.log"
Apr 20 11:46:40.076542 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:40.076513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/sbdb/0.log"
Apr 20 11:46:40.240662 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:40.240577 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8q26d_6933a359-bd42-4dcd-94d7-cc72b948509c/ovnkube-controller/0.log"
Apr 20 11:46:41.872888 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:41.872860 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qzgrd_bc2ca7e3-e1d9-4855-90b5-3eb77ac6efea/network-check-target-container/0.log"
Apr 20 11:46:42.633857 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:42.633819 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s8s8p_72feb6a6-4564-4f5a-a26a-008d43db43b7/iptables-alerter/0.log"
Apr 20 11:46:43.259926 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:43.259894 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-spb5n_6316a3d4-5236-4574-91c5-ccd6e85aee53/tuned/0.log"
Apr 20 11:46:46.325323 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:46.325288 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zzv4z_f6bd444d-179f-465d-9358-90444a0bd1b0/csi-driver/0.log"
Apr 20 11:46:46.347584 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:46.347561 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zzv4z_f6bd444d-179f-465d-9358-90444a0bd1b0/csi-node-driver-registrar/0.log"
Apr 20 11:46:46.366030 ip-10-0-133-125 kubenswrapper[2578]: I0420 11:46:46.366003 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-zzv4z_f6bd444d-179f-465d-9358-90444a0bd1b0/csi-liveness-probe/0.log"