Apr 17 08:02:39.102235 ip-10-0-138-63 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 08:02:39.632514 ip-10-0-138-63 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:39.632514 ip-10-0-138-63 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 08:02:39.632514 ip-10-0-138-63 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:39.632514 ip-10-0-138-63 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 08:02:39.632514 ip-10-0-138-63 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 08:02:39.634562 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.634471    2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 08:02:39.642447 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642310    2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:39.642447 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642443    2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:39.642447 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642449    2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:39.642447 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642454    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642458    2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642463    2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642467    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642474    2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642478    2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642482    2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642486    2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642491    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642495    2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642499    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642503    2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642507    2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642511    2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642514    2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642518    2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642522    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642525    2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642530    2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642534    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:39.642699 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642538    2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642544    2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642550    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642554    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642561    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642565    2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642570    2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642575    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642579    2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642583    2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642587    2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642592    2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642596    2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642600    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642606    2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642610    2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642614    2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642618    2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642623    2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642627    2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:39.643508 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642632    2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642636    2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642641    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642645    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642649    2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642654    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642658    2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642662    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642666    2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642673    2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642680    2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642685    2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642690    2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642695    2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642699    2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642704    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642708    2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642712    2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642717    2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642722    2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:39.644241 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642726    2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642731    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642735    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642740    2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642744    2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642748    2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642753    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642760    2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642764    2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642769    2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642773    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642778    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642783    2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642787    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642791    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642795    2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642800    2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642804    2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642807    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:39.644722 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642812    2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642816    2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642820    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.642824    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643472    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643484    2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643489    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643493    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643497    2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643502    2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643506    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643510    2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643524    2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643529    2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643533    2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643537    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643541    2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643545    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643550    2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:39.645224 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643554    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643559    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643563    2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643568    2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643572    2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643576    2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643581    2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643587    2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643591    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643596    2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643600    2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643604    2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643609    2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643613    2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643617    2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643621    2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643625    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643629    2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643633    2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643637    2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:39.645682 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643641    2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643645    2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643649    2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643653    2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643657    2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643661    2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643666    2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643670    2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643674    2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643677    2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643681    2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643686    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643691    2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643695    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643700    2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643704    2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643708    2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643713    2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643717    2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:39.646280 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643723    2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643730    2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643736    2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643744    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643749    2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643754    2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643758    2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643763    2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643767    2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643772    2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643777    2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643781    2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643785    2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643790    2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643794    2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643798    2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643802    2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643806    2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643810    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:39.647104 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643814    2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643819    2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643825    2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643829    2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643833    2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643837    2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643843    2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643848    2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643852    2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643856    2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643861    2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643865    2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.643869    2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646090    2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646108    2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646121    2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646128    2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646136    2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646142    2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646148    2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646155    2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646160    2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 08:02:39.647807 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646165    2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646171    2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646177    2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646182    2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646187    2570 flags.go:64] FLAG: --cgroup-root=""
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646192    2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646196    2570 flags.go:64] FLAG: --client-ca-file=""
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646202    2570 flags.go:64] FLAG: --cloud-config=""
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646206    2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646212    2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646218    2570 flags.go:64] FLAG: --cluster-domain=""
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646222    2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646228    2570 flags.go:64] FLAG: --config-dir=""
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646233    2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646238    2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646244    2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646250    2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646256    2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646262    2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646266    2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646272    2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646277    2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646281    2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646286    2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646293    2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 08:02:39.648458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646297    2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646302    2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646306    2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646312    2570 flags.go:64] FLAG: --enable-server="true"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646317    2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646325    2570 flags.go:64] FLAG: --event-burst="100"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646330    2570 flags.go:64] FLAG: --event-qps="50"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646335    2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646341    2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646346    2570 flags.go:64] FLAG: --eviction-hard=""
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646352    2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646356    2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646361    2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646366    2570 flags.go:64] FLAG: --eviction-soft=""
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646372    2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646377    2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646381    2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646386    2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646391    2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646396    2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646401    2570 flags.go:64] FLAG: --feature-gates=""
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646407    2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646412    2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646417    2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646423    2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646428    2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 08:02:39.649214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646433    2570 flags.go:64] FLAG: --help="false"
Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417
08:02:39.646438 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-138-63.ec2.internal" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646444 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646449 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646454 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646459 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646465 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646470 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646474 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646479 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646485 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646490 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646496 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646501 2570 flags.go:64] FLAG: --kube-reserved="" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646506 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 
08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646510 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646515 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646520 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646525 2570 flags.go:64] FLAG: --lock-file="" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646530 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646535 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646540 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646548 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 08:02:39.650084 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646553 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646558 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646563 2570 flags.go:64] FLAG: --logging-format="text" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646567 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646573 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646578 2570 flags.go:64] FLAG: --manifest-url="" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646583 2570 
flags.go:64] FLAG: --manifest-url-header="" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646591 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646596 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646603 2570 flags.go:64] FLAG: --max-pods="110" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646608 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646612 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646617 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646623 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646628 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646633 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646638 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646650 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646655 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646660 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646666 2570 flags.go:64] FLAG: --pod-cidr="" Apr 17 08:02:39.650686 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646671 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646679 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646684 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 08:02:39.650686 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646689 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646694 2570 flags.go:64] FLAG: --port="10250" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646699 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646704 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02db409698314f0ff" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646709 2570 flags.go:64] FLAG: --qos-reserved="" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646714 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646719 2570 flags.go:64] FLAG: --register-node="true" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646724 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646729 2570 flags.go:64] FLAG: --register-with-taints="" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646736 2570 flags.go:64] FLAG: --registry-burst="10" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646740 2570 flags.go:64] FLAG: --registry-qps="5" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:02:39.646745 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646750 2570 flags.go:64] FLAG: --reserved-memory="" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646756 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646761 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646766 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646771 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646776 2570 flags.go:64] FLAG: --runonce="false" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646781 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646786 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646791 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646796 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646801 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646806 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646812 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646817 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 
08:02:39.651360 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646822 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646827 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646832 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646837 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646842 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646847 2570 flags.go:64] FLAG: --system-cgroups="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646852 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646861 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646866 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646871 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646877 2570 flags.go:64] FLAG: --tls-min-version="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646881 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646886 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646891 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646896 2570 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646901 2570 flags.go:64] FLAG: --v="2" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646907 2570 flags.go:64] FLAG: --version="false" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646914 2570 flags.go:64] FLAG: --vmodule="" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646925 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.646930 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647102 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647109 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647114 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647118 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 08:02:39.652038 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647122 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647126 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647131 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647136 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 08:02:39.652604 ip-10-0-138-63 
kubenswrapper[2570]: W0417 08:02:39.647140 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647145 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647149 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647153 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647157 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647162 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647166 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647170 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647175 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647180 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647184 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647188 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647193 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 08:02:39.652604 
ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647197 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647201 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647205 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 08:02:39.652604 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647209 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647213 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647217 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647221 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647226 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647230 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647236 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647241 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647247 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647253 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647257 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647262 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647266 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647270 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647275 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647279 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647284 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647288 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647294 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647298 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 08:02:39.653171 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647302 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647306 2570 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647311 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647315 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647319 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647324 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647329 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647333 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647338 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647370 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647377 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647382 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647387 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647392 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 08:02:39.653704 
ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647396 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647401 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647405 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647412 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647419 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 08:02:39.653704 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647424 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647428 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647433 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647437 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647442 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647446 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647451 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 08:02:39.654194 ip-10-0-138-63 
kubenswrapper[2570]: W0417 08:02:39.647455 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647459 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647465 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647470 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647473 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647477 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647482 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647486 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647491 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647495 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647499 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647503 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647508 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 
08:02:39.654194 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647513 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:39.654687 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647526 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:39.654687 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.647530 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:39.654687 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.648197 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 08:02:39.656081 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.656060 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 08:02:39.656124 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.656083 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656131 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656136 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656139 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656143 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656146 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656150 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:39.656152 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656153 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656157 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656160 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656162 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656165 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656168 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656171 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656174 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656176 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656179 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656182 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656185 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656187 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656190 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656192 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656195 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656197 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656200 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656203 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656205 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:39.656327 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656208 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656210 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656214 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656218 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656221 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656224 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656227 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656230 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656232 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656235 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656237 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656240 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656242 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656245 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656248 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656251 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656254 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656257 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656259 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:39.656814 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656262 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656264 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656267 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656269 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656272 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656275 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656277 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656280 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656282 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656285 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656287 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656290 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656292 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656295 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656298 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656300 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656302 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656305 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656307 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:39.657376 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656310 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656313 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656316 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656318 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656322 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656326 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656329 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656331 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656335 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656338 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656341 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656343 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656346 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656348 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656351 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656354 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656357 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656359 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656362 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656365 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:39.657832 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656367 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656370 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.656375 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656469 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656473 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656476 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656479 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656482 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656484 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656487 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656490 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656493 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656496 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656500 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656503 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 08:02:39.658339 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656506 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656509 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656511 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656514 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656517 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656520 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656523 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656526 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656529 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656531 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656534 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656537 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656539 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656542 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656544 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656547 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656550 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656552 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656555 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656557 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 08:02:39.658715 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656560 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656562 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656565 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656567 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656570 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656573 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656575 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656578 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656580 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656583 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656585 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656589 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656591 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656594 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656597 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656601 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656603 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656607 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656610 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 08:02:39.659221 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656613 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656615 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656618 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656620 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656623 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656625 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656628 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656630 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656634 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656636 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656639 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656642 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656644 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656647 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656649 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656652 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656654 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656657 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656660 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 08:02:39.659695 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656662 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656665 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656667 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656670 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656674 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656677 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656680 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656683 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656685 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656687 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656691 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656693 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656696 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656699 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656702 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 08:02:39.660192 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:39.656705 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 08:02:39.660555 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.656710 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 08:02:39.660555 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.657556 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 08:02:39.660555 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.660141 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 08:02:39.661515 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.661503 2570 server.go:1019] "Starting client certificate rotation"
Apr 17 08:02:39.661617 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.661600 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 08:02:39.661652 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.661643 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 08:02:39.695324 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.695306 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 08:02:39.698069 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.698052 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 08:02:39.718677 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.718655 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 17 08:02:39.720900 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.720880 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 08:02:39.725685 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.725671 2570 log.go:25] "Validated CRI v1 image API"
Apr 17 08:02:39.728328 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.728312 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 08:02:39.732259 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.732233 2570 fs.go:135] Filesystem UUIDs: map[03174643-4861-4649-a941-a23e765af093:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a6f17e16-dcd8-454d-baca-837b150578ec:/dev/nvme0n1p3]
Apr 17 08:02:39.732322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.732259 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 08:02:39.738129 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.738023 2570 manager.go:217] Machine: {Timestamp:2026-04-17 08:02:39.73582678 +0000 UTC m=+0.493304199 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097082 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec296650ace3b8a6ef3e81ec3cb88ed1 SystemUUID:ec296650-ace3-b8a6-ef3e-81ec3cb88ed1 BootID:17629894-98db-4cf7-8914-6e2d4cf380b5 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a3:17:8d:34:29 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a3:17:8d:34:29 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:0f:db:d3:4a:9f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 08:02:39.738129 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.738128 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 08:02:39.738373 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.738248 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 08:02:39.741054 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.741027 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 08:02:39.741197 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.741057 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-63.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 08:02:39.741245 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.741204 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 08:02:39.741245 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.741213 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 08:02:39.741245 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.741226 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 08:02:39.741330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.741245 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 08:02:39.742594 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.742578 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fn2lf"
Apr 17 08:02:39.742641 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.742626 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 08:02:39.742742 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.742732 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 08:02:39.745041 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.745029 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 08:02:39.745077 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.745050 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 08:02:39.745822 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.745812 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 08:02:39.745860 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.745827 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 17 08:02:39.745860 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.745835 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 08:02:39.747075 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.747061 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 08:02:39.747131 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.747079 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 08:02:39.750717 ip-10-0-138-63
kubenswrapper[2570]: I0417 08:02:39.750698 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fn2lf" Apr 17 08:02:39.750907 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.750886 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 08:02:39.752560 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.752543 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 08:02:39.754806 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754787 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754811 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754818 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754825 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754836 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754844 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754852 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754858 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:02:39.754864 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754876 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754885 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 08:02:39.754895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.754893 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 08:02:39.758711 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.758676 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 08:02:39.758784 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.758719 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 08:02:39.761762 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.761737 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:39.761762 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.761699 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:39.764225 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.764210 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 08:02:39.764292 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.764251 2570 server.go:1295] "Started kubelet" Apr 17 08:02:39.764369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.764334 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 08:02:39.764454 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.764374 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 08:02:39.764454 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.764448 2570 server_v1.go:47] "podresources" 
method="list" useActivePods=true Apr 17 08:02:39.765110 ip-10-0-138-63 systemd[1]: Started Kubernetes Kubelet. Apr 17 08:02:39.766369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.766354 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 08:02:39.767159 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.767142 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-63.ec2.internal" not found Apr 17 08:02:39.767361 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.767349 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 17 08:02:39.771565 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.771545 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 08:02:39.772077 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.772058 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 08:02:39.772906 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.772886 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 08:02:39.772906 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.772908 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 08:02:39.773055 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.772993 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-63.ec2.internal\" not found" Apr 17 08:02:39.773055 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773017 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 17 08:02:39.773055 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773027 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 17 08:02:39.773191 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773059 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 08:02:39.773244 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:02:39.773189 2570 factory.go:153] Registering CRI-O factory Apr 17 08:02:39.773295 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773247 2570 factory.go:223] Registration of the crio container factory successfully Apr 17 08:02:39.773345 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773310 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 08:02:39.773345 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773320 2570 factory.go:55] Registering systemd factory Apr 17 08:02:39.773345 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773327 2570 factory.go:223] Registration of the systemd container factory successfully Apr 17 08:02:39.773478 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773353 2570 factory.go:103] Registering Raw factory Apr 17 08:02:39.773478 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773386 2570 manager.go:1196] Started watching for new ooms in manager Apr 17 08:02:39.773820 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.773801 2570 manager.go:319] Starting recovery of all containers Apr 17 08:02:39.774507 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.774489 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:39.775205 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.775164 2570 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 08:02:39.779234 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.779210 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-63.ec2.internal\" not found" node="ip-10-0-138-63.ec2.internal" Apr 17 08:02:39.782330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.782304 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-63.ec2.internal" not found Apr 17 08:02:39.783164 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.783148 2570 manager.go:324] Recovery completed Apr 17 08:02:39.789102 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.789089 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:39.792451 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.792434 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:39.792519 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.792462 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:39.792519 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.792475 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:39.792931 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.792918 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 08:02:39.792982 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.792930 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 08:02:39.792982 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.792961 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 17 08:02:39.795649 ip-10-0-138-63 kubenswrapper[2570]: 
I0417 08:02:39.795637 2570 policy_none.go:49] "None policy: Start" Apr 17 08:02:39.795685 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.795653 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 08:02:39.795685 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.795662 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 17 08:02:39.835497 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.835482 2570 manager.go:341] "Starting Device Plugin manager" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.835582 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.835604 2570 server.go:85] "Starting device plugin registration server" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.835820 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.835832 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.835959 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.836043 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.836053 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.836516 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.836546 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-63.ec2.internal\" not found" Apr 17 08:02:39.851460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.841740 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-138-63.ec2.internal" not found Apr 17 08:02:39.880984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.880953 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 08:02:39.882103 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.882085 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 08:02:39.882103 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.882107 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 08:02:39.882240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.882122 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 08:02:39.882240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.882128 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 08:02:39.882240 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.882158 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 08:02:39.885501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.885449 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 08:02:39.936801 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.936772 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 08:02:39.937801 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.937785 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeHasSufficientMemory" Apr 17 08:02:39.937861 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.937815 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 08:02:39.937861 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.937826 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeHasSufficientPID" Apr 17 08:02:39.937861 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.937849 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-63.ec2.internal" Apr 17 08:02:39.946297 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.946283 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-63.ec2.internal" Apr 17 08:02:39.946344 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:39.946303 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-63.ec2.internal\": node \"ip-10-0-138-63.ec2.internal\" not found" Apr 17 08:02:39.982380 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.982358 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal"] Apr 17 08:02:39.985117 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.985093 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" Apr 17 08:02:39.985239 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:39.985224 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.007577 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.007558 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.011930 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.011916 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.016209 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.016194 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 08:02:40.021579 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.021566 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 08:02:40.174025 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.173960 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23bf6540eb6d43d644d0a9f36f338fa6-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal\" (UID: \"23bf6540eb6d43d644d0a9f36f338fa6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.174025 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.173987 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23bf6540eb6d43d644d0a9f36f338fa6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal\" (UID: \"23bf6540eb6d43d644d0a9f36f338fa6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.174025 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.174005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19617c61026894db47a634e0cbb16491-config\") pod \"kube-apiserver-proxy-ip-10-0-138-63.ec2.internal\" (UID: \"19617c61026894db47a634e0cbb16491\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.274967 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.274927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19617c61026894db47a634e0cbb16491-config\") pod \"kube-apiserver-proxy-ip-10-0-138-63.ec2.internal\" (UID: \"19617c61026894db47a634e0cbb16491\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.275091 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.274974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23bf6540eb6d43d644d0a9f36f338fa6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal\" (UID: \"23bf6540eb6d43d644d0a9f36f338fa6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 
08:02:40.275091 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.275002 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23bf6540eb6d43d644d0a9f36f338fa6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal\" (UID: \"23bf6540eb6d43d644d0a9f36f338fa6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.275091 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.275038 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/19617c61026894db47a634e0cbb16491-config\") pod \"kube-apiserver-proxy-ip-10-0-138-63.ec2.internal\" (UID: \"19617c61026894db47a634e0cbb16491\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.275091 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.275044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/23bf6540eb6d43d644d0a9f36f338fa6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal\" (UID: \"23bf6540eb6d43d644d0a9f36f338fa6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.275091 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.275052 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23bf6540eb6d43d644d0a9f36f338fa6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal\" (UID: \"23bf6540eb6d43d644d0a9f36f338fa6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.319096 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.319074 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.324897 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.324882 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" Apr 17 08:02:40.660617 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.660536 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 08:02:40.661317 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.660707 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 08:02:40.661317 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.660707 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 08:02:40.661317 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.660716 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 08:02:40.746480 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.746443 2570 apiserver.go:52] "Watching apiserver" Apr 17 08:02:40.753023 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.752984 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 07:57:39 +0000 UTC" 
deadline="2028-01-10 17:59:50.444857678 +0000 UTC" Apr 17 08:02:40.753023 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.753019 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15201h57m9.691841685s" Apr 17 08:02:40.755875 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.755857 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 08:02:40.757276 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.757249 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal","openshift-multus/multus-additional-cni-plugins-xc6x7","openshift-multus/multus-krs9k","kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal","openshift-multus/network-metrics-daemon-x6js9","openshift-network-diagnostics/network-check-target-rkbs5","openshift-network-operator/iptables-alerter-sd8ch","openshift-ovn-kubernetes/ovnkube-node-nt4p9","kube-system/konnectivity-agent-7jkx8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68","openshift-cluster-node-tuning-operator/tuned-2f72z","openshift-dns/node-resolver-cqc8h","openshift-image-registry/node-ca-skwnw"] Apr 17 08:02:40.759773 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.759752 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.760894 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.760877 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cqc8h" Apr 17 08:02:40.762198 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.762175 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.762559 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.762540 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 08:02:40.762806 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.762791 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 08:02:40.763334 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.763318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.763859 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.763844 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 08:02:40.763914 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.763877 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2wrzj\"" Apr 17 08:02:40.764219 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764203 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 08:02:40.764270 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764215 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 08:02:40.764270 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764259 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lx6b2\"" Apr 17 08:02:40.764361 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764266 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 08:02:40.764361 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764319 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 08:02:40.764361 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764336 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 08:02:40.764519 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764507 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:02:40.764599 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.764585 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 08:02:40.765726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.765704 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:40.765830 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.765724 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lv2z6\""
Apr 17 08:02:40.765830 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.765760 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 08:02:40.765830 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.765792 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 08:02:40.765830 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.765813 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 08:02:40.766085 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.765832 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:02:40.766410 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.766386 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 08:02:40.766410 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.766396 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 08:02:40.767145 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.767128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sd8ch"
Apr 17 08:02:40.767378 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.767240 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bqp68\""
Apr 17 08:02:40.767378 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.767353 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 08:02:40.767606 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.767589 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 08:02:40.767677 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.767606 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ljsjb\""
Apr 17 08:02:40.768460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.768440 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:40.768544 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.768525 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:02:40.769484 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.769467 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 08:02:40.769583 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.769497 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 08:02:40.769740 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.769726 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 08:02:40.769794 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.769732 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.769794 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.769776 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6lrwt\""
Apr 17 08:02:40.771031 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.771008 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.771672 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.771655 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 08:02:40.772181 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.772164 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p566c\""
Apr 17 08:02:40.772327 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.772316 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 08:02:40.772327 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.772318 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 08:02:40.772417 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.772381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-skwnw"
Apr 17 08:02:40.772711 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.772695 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 08:02:40.773599 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.773580 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 08:02:40.773774 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.773747 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 08:02:40.773843 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.773782 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-d6vrr\""
Apr 17 08:02:40.775167 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.774953 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 08:02:40.775167 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.774968 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xm7h6\""
Apr 17 08:02:40.775322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.775219 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 08:02:40.775322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.775268 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 08:02:40.775464 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.775430 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 08:02:40.777348 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-host\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.777451 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-etc-selinux\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.777451 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7279n\" (UniqueName: \"kubernetes.io/projected/d5a576c4-fd46-48a0-9584-c6849f6fca38-kube-api-access-7279n\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.777451 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-systemd-units\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.777451 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777425 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-slash\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-cni-netd\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-cni-multus\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777499 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-lib-modules\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777513 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-systemd\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777527 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-ovn\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-ovnkube-script-lib\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777562 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-system-cni-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-iptables-alerter-script\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch"
Apr 17 08:02:40.777644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777628 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777670 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-ovnkube-config\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777730 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-hostroot\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777754 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e63f1d9b-13b7-4099-ad63-64b33b70f697-hosts-file\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777774 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-tuned\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-registration-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-cni-bin\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-cnibin\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx5zj\" (UniqueName: \"kubernetes.io/projected/6b843f47-d57d-4596-961a-205314dbf0f8-kube-api-access-zx5zj\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777878 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777896 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777925 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6wn\" (UniqueName: \"kubernetes.io/projected/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-kube-api-access-8j6wn\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nn2g\" (UniqueName: \"kubernetes.io/projected/e63f1d9b-13b7-4099-ad63-64b33b70f697-kube-api-access-7nn2g\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.777993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysctl-conf\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-var-lib-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-os-release\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778047 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b843f47-d57d-4596-961a-205314dbf0f8-cni-binary-copy\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778062 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-k8s-cni-cncf-io\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-kubernetes\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-systemd\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-tmp\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778133 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-os-release\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778156 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-kubelet\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778170 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c3085fe-841e-4ff9-aa63-90a0b035c240-host\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778185 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7nw\" (UniqueName: \"kubernetes.io/projected/2c3085fe-841e-4ff9-aa63-90a0b035c240-kube-api-access-xl7nw\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-var-lib-kubelet\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778252 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvqwj\" (UniqueName: \"kubernetes.io/projected/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-kube-api-access-lvqwj\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysconfig\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778288 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-run\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-device-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.778871 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778317 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-system-cni-dir\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778333 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778370 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwszq\" (UniqueName: \"kubernetes.io/projected/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-kube-api-access-qwszq\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-multus-certs\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-sys\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778463 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778486 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-node-log\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-cni-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778528 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-socket-dir-parent\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778542 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c3085fe-841e-4ff9-aa63-90a0b035c240-serviceca\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778559 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbdd2\" (UniqueName: \"kubernetes.io/projected/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-kube-api-access-bbdd2\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778586 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-sys-fs\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-run-netns\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778618 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba1fb94a-02fe-4c28-9c64-63dbb3a0662a-agent-certs\") pod \"konnectivity-agent-7jkx8\" (UID: \"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a\") " pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba1fb94a-02fe-4c28-9c64-63dbb3a0662a-konnectivity-ca\") pod \"konnectivity-agent-7jkx8\" (UID: \"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a\") " pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:02:40.779626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778663 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-socket-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778688 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-etc-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778706 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-env-overrides\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778730 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkl9n\" (UniqueName: \"kubernetes.io/projected/832735dc-0dda-465b-96fe-56bb39f2a72b-kube-api-access-hkl9n\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778764 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-kubelet\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778786 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysctl-d\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778809 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-host-slash\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778833 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cnibin\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-log-socket\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/832735dc-0dda-465b-96fe-56bb39f2a72b-ovn-node-metrics-cert\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-netns\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-cni-bin\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-etc-kubernetes\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.778997 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-modprobe-d\") pod 
\"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.779021 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cni-binary-copy\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.779049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-run-ovn-kubernetes\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.780446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.779070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-conf-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.781053 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.779087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b843f47-d57d-4596-961a-205314dbf0f8-multus-daemon-config\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.781053 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.779116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e63f1d9b-13b7-4099-ad63-64b33b70f697-tmp-dir\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h" Apr 17 08:02:40.786694 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.786676 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 08:02:40.805416 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.805393 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7kt4j" Apr 17 08:02:40.812852 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.812829 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7kt4j" Apr 17 08:02:40.812973 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:40.812881 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19617c61026894db47a634e0cbb16491.slice/crio-5831024e3f0033daa6c56cac73698cf4eed64d5303a8e54a77e301cdb2b73bd3 WatchSource:0}: Error finding container 5831024e3f0033daa6c56cac73698cf4eed64d5303a8e54a77e301cdb2b73bd3: Status 404 returned error can't find the container with id 5831024e3f0033daa6c56cac73698cf4eed64d5303a8e54a77e301cdb2b73bd3 Apr 17 08:02:40.813228 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:40.813207 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23bf6540eb6d43d644d0a9f36f338fa6.slice/crio-a87f3826e6b184b6ce00f9dc775d53cc8700d25af1bbcb6633380317feab38a1 WatchSource:0}: Error finding container a87f3826e6b184b6ce00f9dc775d53cc8700d25af1bbcb6633380317feab38a1: Status 404 returned error can't find the container with id 
a87f3826e6b184b6ce00f9dc775d53cc8700d25af1bbcb6633380317feab38a1 Apr 17 08:02:40.817617 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.817603 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:02:40.879856 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-systemd\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.879856 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879855 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-ovn\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-ovnkube-script-lib\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-system-cni-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-iptables-alerter-script\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-ovn\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879926 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-systemd\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880010 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-ovnkube-config\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880028 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-hostroot\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e63f1d9b-13b7-4099-ad63-64b33b70f697-hosts-file\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.880055 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-tuned\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-hostroot\") pod \"multus-krs9k\" (UID: 
\"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880080 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.880117 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:41.380094961 +0000 UTC m=+2.137572385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e63f1d9b-13b7-4099-ad63-64b33b70f697-hosts-file\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-registration-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880200 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880224 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-cni-bin\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-cnibin\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx5zj\" (UniqueName: \"kubernetes.io/projected/6b843f47-d57d-4596-961a-205314dbf0f8-kube-api-access-zx5zj\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-registration-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: 
\"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880334 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-cni-bin\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.880442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6wn\" (UniqueName: \"kubernetes.io/projected/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-kube-api-access-8j6wn\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nn2g\" (UniqueName: 
\"kubernetes.io/projected/e63f1d9b-13b7-4099-ad63-64b33b70f697-kube-api-access-7nn2g\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880366 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysctl-conf\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-var-lib-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880462 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-os-release\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880468 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xc6x7\" 
(UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b843f47-d57d-4596-961a-205314dbf0f8-cni-binary-copy\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-k8s-cni-cncf-io\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-kubernetes\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-systemd\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-iptables-alerter-script\") pod 
\"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-tmp\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-os-release\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-kubelet\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880642 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-ovnkube-config\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-ovnkube-script-lib\") 
pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880658 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c3085fe-841e-4ff9-aa63-90a0b035c240-host\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw" Apr 17 08:02:40.881248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2c3085fe-841e-4ff9-aa63-90a0b035c240-host\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880730 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7nw\" (UniqueName: \"kubernetes.io/projected/2c3085fe-841e-4ff9-aa63-90a0b035c240-kube-api-access-xl7nw\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-k8s-cni-cncf-io\") pod \"multus-krs9k\" (UID: 
\"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880755 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-kubernetes\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-var-lib-kubelet\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880766 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-cnibin\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvqwj\" (UniqueName: \"kubernetes.io/projected/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-kube-api-access-lvqwj\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-systemd\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysctl-conf\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880871 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-os-release\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-kubelet\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysconfig\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-run\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880973 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-var-lib-kubelet\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880994 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-os-release\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.880917 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-var-lib-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysconfig\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882098 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881051 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-run\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 
08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-device-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881172 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-system-cni-dir\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881199 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwszq\" (UniqueName: \"kubernetes.io/projected/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-kube-api-access-qwszq\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881242 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-device-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881278 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-multus-certs\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-system-cni-dir\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-sys\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-run-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-node-log\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-sys\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-cni-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-multus-certs\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-socket-dir-parent\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.882921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881446 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-cni-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c3085fe-841e-4ff9-aa63-90a0b035c240-serviceca\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881469 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b843f47-d57d-4596-961a-205314dbf0f8-cni-binary-copy\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbdd2\" (UniqueName: 
\"kubernetes.io/projected/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-kube-api-access-bbdd2\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881488 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-node-log\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.879987 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-system-cni-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881542 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-socket-dir-parent\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-sys-fs\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-run-netns\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881653 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba1fb94a-02fe-4c28-9c64-63dbb3a0662a-agent-certs\") pod \"konnectivity-agent-7jkx8\" (UID: \"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a\") " pod="kube-system/konnectivity-agent-7jkx8" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881676 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-sys-fs\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba1fb94a-02fe-4c28-9c64-63dbb3a0662a-konnectivity-ca\") pod \"konnectivity-agent-7jkx8\" (UID: \"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a\") " pod="kube-system/konnectivity-agent-7jkx8" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-socket-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-etc-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-env-overrides\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881888 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkl9n\" (UniqueName: \"kubernetes.io/projected/832735dc-0dda-465b-96fe-56bb39f2a72b-kube-api-access-hkl9n\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-kubelet\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881921 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2c3085fe-841e-4ff9-aa63-90a0b035c240-serviceca\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw" Apr 17 08:02:40.883862 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysctl-d\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.881990 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882145 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-socket-dir\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882159 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-kubelet\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882182 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-run-netns\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882205 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-host-slash\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882216 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-etc-openvswitch\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cnibin\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-host-slash\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-log-socket\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882285 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-sysctl-d\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-log-socket\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cnibin\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882320 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/832735dc-0dda-465b-96fe-56bb39f2a72b-env-overrides\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882361 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/832735dc-0dda-465b-96fe-56bb39f2a72b-ovn-node-metrics-cert\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882389 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-netns\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-cni-bin\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882450 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-etc-kubernetes\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.884474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-run-netns\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-modprobe-d\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cni-binary-copy\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882562 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-modprobe-d\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-run-ovn-kubernetes\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-conf-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b843f47-d57d-4596-961a-205314dbf0f8-multus-daemon-config\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882623 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-cni-bin\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882644 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e63f1d9b-13b7-4099-ad63-64b33b70f697-tmp-dir\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-host\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-etc-kubernetes\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-etc-selinux\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882709 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882716 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-multus-conf-dir\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7279n\" (UniqueName: \"kubernetes.io/projected/d5a576c4-fd46-48a0-9584-c6849f6fca38-kube-api-access-7279n\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882771 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-run-ovn-kubernetes\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-systemd-units\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.885153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882824 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-slash\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882870 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-cni-netd\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882894 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-cni-multus\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k" Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-lib-modules\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z" Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.882993 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ba1fb94a-02fe-4c28-9c64-63dbb3a0662a-konnectivity-ca\") pod \"konnectivity-agent-7jkx8\" (UID: \"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a\") " pod="kube-system/konnectivity-agent-7jkx8" Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883051 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-host\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883087 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-lib-modules\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883127 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-cni-netd\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-systemd-units\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883162 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b843f47-d57d-4596-961a-205314dbf0f8-host-var-lib-cni-multus\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883200 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/832735dc-0dda-465b-96fe-56bb39f2a72b-host-slash\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d5a576c4-fd46-48a0-9584-c6849f6fca38-etc-selinux\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b843f47-d57d-4596-961a-205314dbf0f8-multus-daemon-config\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-etc-tuned\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883556 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e63f1d9b-13b7-4099-ad63-64b33b70f697-tmp-dir\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883556 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-tmp\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.883782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-cni-binary-copy\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.884289 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ba1fb94a-02fe-4c28-9c64-63dbb3a0662a-agent-certs\") pod \"konnectivity-agent-7jkx8\" (UID: \"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a\") " pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:02:40.885725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.885407 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/832735dc-0dda-465b-96fe-56bb39f2a72b-ovn-node-metrics-cert\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.886356 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.885407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" event={"ID":"23bf6540eb6d43d644d0a9f36f338fa6","Type":"ContainerStarted","Data":"a87f3826e6b184b6ce00f9dc775d53cc8700d25af1bbcb6633380317feab38a1"}
Apr 17 08:02:40.886356 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.886328 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" event={"ID":"19617c61026894db47a634e0cbb16491","Type":"ContainerStarted","Data":"5831024e3f0033daa6c56cac73698cf4eed64d5303a8e54a77e301cdb2b73bd3"}
Apr 17 08:02:40.887747 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.887733 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:40.887790 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.887749 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:40.887790 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.887758 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:40.887852 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:40.887821 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. No retries permitted until 2026-04-17 08:02:41.38780681 +0000 UTC m=+2.145284230 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:40.889486 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.889464 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvqwj\" (UniqueName: \"kubernetes.io/projected/aa9f1d02-c04d-4591-a2d3-aa61e92869ba-kube-api-access-lvqwj\") pod \"iptables-alerter-sd8ch\" (UID: \"aa9f1d02-c04d-4591-a2d3-aa61e92869ba\") " pod="openshift-network-operator/iptables-alerter-sd8ch"
Apr 17 08:02:40.890635 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.890608 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nn2g\" (UniqueName: \"kubernetes.io/projected/e63f1d9b-13b7-4099-ad63-64b33b70f697-kube-api-access-7nn2g\") pod \"node-resolver-cqc8h\" (UID: \"e63f1d9b-13b7-4099-ad63-64b33b70f697\") " pod="openshift-dns/node-resolver-cqc8h"
Apr 17 08:02:40.890735 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.890646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7nw\" (UniqueName: \"kubernetes.io/projected/2c3085fe-841e-4ff9-aa63-90a0b035c240-kube-api-access-xl7nw\") pod \"node-ca-skwnw\" (UID: \"2c3085fe-841e-4ff9-aa63-90a0b035c240\") " pod="openshift-image-registry/node-ca-skwnw"
Apr 17 08:02:40.890735 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.890646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbdd2\" (UniqueName: \"kubernetes.io/projected/0e200b23-a648-49b4-9ee6-7b0e5ceaba16-kube-api-access-bbdd2\") pod \"tuned-2f72z\" (UID: \"0e200b23-a648-49b4-9ee6-7b0e5ceaba16\") " pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:40.891143 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.891122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx5zj\" (UniqueName: \"kubernetes.io/projected/6b843f47-d57d-4596-961a-205314dbf0f8-kube-api-access-zx5zj\") pod \"multus-krs9k\" (UID: \"6b843f47-d57d-4596-961a-205314dbf0f8\") " pod="openshift-multus/multus-krs9k"
Apr 17 08:02:40.891417 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.891400 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkl9n\" (UniqueName: \"kubernetes.io/projected/832735dc-0dda-465b-96fe-56bb39f2a72b-kube-api-access-hkl9n\") pod \"ovnkube-node-nt4p9\" (UID: \"832735dc-0dda-465b-96fe-56bb39f2a72b\") " pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:40.891707 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.891684 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j6wn\" (UniqueName: \"kubernetes.io/projected/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-kube-api-access-8j6wn\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:40.891884 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.891870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwszq\" (UniqueName: \"kubernetes.io/projected/26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1-kube-api-access-qwszq\") pod \"multus-additional-cni-plugins-xc6x7\" (UID: \"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1\") " pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:40.892183 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:40.892164 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7279n\" (UniqueName: \"kubernetes.io/projected/d5a576c4-fd46-48a0-9584-c6849f6fca38-kube-api-access-7279n\") pod \"aws-ebs-csi-driver-node-9tl68\" (UID: \"d5a576c4-fd46-48a0-9584-c6849f6fca38\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:41.089889 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.089816 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:02:41.096136 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.096106 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832735dc_0dda_465b_96fe_56bb39f2a72b.slice/crio-c5f2c6acd2f8f0ee86943b107b54c336dd9307a87371dab69e2b5a9e6363d792 WatchSource:0}: Error finding container c5f2c6acd2f8f0ee86943b107b54c336dd9307a87371dab69e2b5a9e6363d792: Status 404 returned error can't find the container with id c5f2c6acd2f8f0ee86943b107b54c336dd9307a87371dab69e2b5a9e6363d792
Apr 17 08:02:41.108823 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.108803 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cqc8h"
Apr 17 08:02:41.114855 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.114834 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode63f1d9b_13b7_4099_ad63_64b33b70f697.slice/crio-57dfccfc7a837f1da469286ff71bf81343f38795acb5a0b53029f37c013b5ae0 WatchSource:0}: Error finding container 57dfccfc7a837f1da469286ff71bf81343f38795acb5a0b53029f37c013b5ae0: Status 404 returned error can't find the container with id 57dfccfc7a837f1da469286ff71bf81343f38795acb5a0b53029f37c013b5ae0
Apr 17 08:02:41.119678 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.119662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-krs9k"
Apr 17 08:02:41.124109 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.124092 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2f72z"
Apr 17 08:02:41.126400 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.126364 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b843f47_d57d_4596_961a_205314dbf0f8.slice/crio-ec6b8782c0a8837cf24ced926a9db35b5e5720c26ed2619b7717f0fdef3b28f7 WatchSource:0}: Error finding container ec6b8782c0a8837cf24ced926a9db35b5e5720c26ed2619b7717f0fdef3b28f7: Status 404 returned error can't find the container with id ec6b8782c0a8837cf24ced926a9db35b5e5720c26ed2619b7717f0fdef3b28f7
Apr 17 08:02:41.128809 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.128790 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:02:41.132751 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.132730 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e200b23_a648_49b4_9ee6_7b0e5ceaba16.slice/crio-5352262efd480d2f0dd05b8aed9288b95dd39bff4f04cf74a705f0342616938a WatchSource:0}: Error finding container 5352262efd480d2f0dd05b8aed9288b95dd39bff4f04cf74a705f0342616938a: Status 404 returned error can't find the container with id 5352262efd480d2f0dd05b8aed9288b95dd39bff4f04cf74a705f0342616938a
Apr 17 08:02:41.135131 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.135115 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sd8ch"
Apr 17 08:02:41.135610 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.135504 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba1fb94a_02fe_4c28_9c64_63dbb3a0662a.slice/crio-5682f8e9ef3147ff3d703f41d1e057a770cc639e5ace4fe4fa86dd750c4c0986 WatchSource:0}: Error finding container 5682f8e9ef3147ff3d703f41d1e057a770cc639e5ace4fe4fa86dd750c4c0986: Status 404 returned error can't find the container with id 5682f8e9ef3147ff3d703f41d1e057a770cc639e5ace4fe4fa86dd750c4c0986
Apr 17 08:02:41.141130 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.141112 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68"
Apr 17 08:02:41.142402 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.142382 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa9f1d02_c04d_4591_a2d3_aa61e92869ba.slice/crio-98cdb0ec26a3bce0d5019465bf256cd05a49c7b32965b824c516fde01a7abbc6 WatchSource:0}: Error finding container 98cdb0ec26a3bce0d5019465bf256cd05a49c7b32965b824c516fde01a7abbc6: Status 404 returned error can't find the container with id 98cdb0ec26a3bce0d5019465bf256cd05a49c7b32965b824c516fde01a7abbc6
Apr 17 08:02:41.146445 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.146427 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xc6x7"
Apr 17 08:02:41.147122 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.147098 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a576c4_fd46_48a0_9584_c6849f6fca38.slice/crio-d63e27e6bfc0a01262b40aa57947e878cb7c89ac56165e60751dc8e32bb138b3 WatchSource:0}: Error finding container d63e27e6bfc0a01262b40aa57947e878cb7c89ac56165e60751dc8e32bb138b3: Status 404 returned error can't find the container with id d63e27e6bfc0a01262b40aa57947e878cb7c89ac56165e60751dc8e32bb138b3
Apr 17 08:02:41.150432 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.150413 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-skwnw"
Apr 17 08:02:41.152635 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.152596 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d3cacc_3bac_4f3c_9676_7ce9a23d4ae1.slice/crio-4919df93eb4914c2b8bc9e3460c9d0cb5a87c50823bef14f5378047578d72373 WatchSource:0}: Error finding container 4919df93eb4914c2b8bc9e3460c9d0cb5a87c50823bef14f5378047578d72373: Status 404 returned error can't find the container with id 4919df93eb4914c2b8bc9e3460c9d0cb5a87c50823bef14f5378047578d72373
Apr 17 08:02:41.156552 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:02:41.156535 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3085fe_841e_4ff9_aa63_90a0b035c240.slice/crio-4f1f405ed078eb4debd1b62a53cf287ca43c5eddd35149f1368050864e051d35 WatchSource:0}: Error finding container 4f1f405ed078eb4debd1b62a53cf287ca43c5eddd35149f1368050864e051d35: Status 404 returned error can't find the container with id 4f1f405ed078eb4debd1b62a53cf287ca43c5eddd35149f1368050864e051d35
Apr 17 08:02:41.386624 ip-10-0-138-63
kubenswrapper[2570]: I0417 08:02:41.386508 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:41.386775 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:41.386675 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:41.386775 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:41.386738 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:42.386719055 +0000 UTC m=+3.144196478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:41.487384 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.487320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:41.487569 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:41.487494 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:41.487569 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:41.487512 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:41.487569 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:41.487531 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:41.487720 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:41.487586 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. No retries permitted until 2026-04-17 08:02:42.487567153 +0000 UTC m=+3.245044567 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:41.701478 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.701381 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 08:02:41.814572 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.814534 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:57:40 +0000 UTC" deadline="2027-10-21 21:11:41.532709272 +0000 UTC"
Apr 17 08:02:41.814572 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.814569 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13261h8m59.718144041s"
Apr 17 08:02:41.906834 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.906791 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-skwnw" event={"ID":"2c3085fe-841e-4ff9-aa63-90a0b035c240","Type":"ContainerStarted","Data":"4f1f405ed078eb4debd1b62a53cf287ca43c5eddd35149f1368050864e051d35"}
Apr 17 08:02:41.921479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.921440 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sd8ch" event={"ID":"aa9f1d02-c04d-4591-a2d3-aa61e92869ba","Type":"ContainerStarted","Data":"98cdb0ec26a3bce0d5019465bf256cd05a49c7b32965b824c516fde01a7abbc6"}
Apr 17 08:02:41.927408 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.927369 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-krs9k"
event={"ID":"6b843f47-d57d-4596-961a-205314dbf0f8","Type":"ContainerStarted","Data":"ec6b8782c0a8837cf24ced926a9db35b5e5720c26ed2619b7717f0fdef3b28f7"}
Apr 17 08:02:41.936955 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.936861 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cqc8h" event={"ID":"e63f1d9b-13b7-4099-ad63-64b33b70f697","Type":"ContainerStarted","Data":"57dfccfc7a837f1da469286ff71bf81343f38795acb5a0b53029f37c013b5ae0"}
Apr 17 08:02:41.945507 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.945471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerStarted","Data":"4919df93eb4914c2b8bc9e3460c9d0cb5a87c50823bef14f5378047578d72373"}
Apr 17 08:02:41.954135 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.954060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" event={"ID":"d5a576c4-fd46-48a0-9584-c6849f6fca38","Type":"ContainerStarted","Data":"d63e27e6bfc0a01262b40aa57947e878cb7c89ac56165e60751dc8e32bb138b3"}
Apr 17 08:02:41.961031 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.960957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7jkx8" event={"ID":"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a","Type":"ContainerStarted","Data":"5682f8e9ef3147ff3d703f41d1e057a770cc639e5ace4fe4fa86dd750c4c0986"}
Apr 17 08:02:41.972031 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.971995 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2f72z" event={"ID":"0e200b23-a648-49b4-9ee6-7b0e5ceaba16","Type":"ContainerStarted","Data":"5352262efd480d2f0dd05b8aed9288b95dd39bff4f04cf74a705f0342616938a"}
Apr 17 08:02:41.981447 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:41.981414 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"c5f2c6acd2f8f0ee86943b107b54c336dd9307a87371dab69e2b5a9e6363d792"}
Apr 17 08:02:42.008179 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.008132 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 08:02:42.227549 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.227210 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 08:02:42.402413 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.402376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:42.402608 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.402536 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:42.402608 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.402602 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:44.402583331 +0000 UTC m=+5.160060753 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:42.504244 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.503559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:42.504244 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.503745 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:42.504244 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.503764 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:42.504244 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.503777 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:42.504244 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.503835 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. No retries permitted until 2026-04-17 08:02:44.503816569 +0000 UTC m=+5.261293978 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:42.675076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.675047 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6cfjr"]
Apr 17 08:02:42.677701 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.677678 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.677824 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.677753 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:02:42.704991 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.704958 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/12c8f408-58f4-4cc4-a90f-967f072165d2-dbus\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.705420 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.705007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.705420 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.705067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/12c8f408-58f4-4cc4-a90f-967f072165d2-kubelet-config\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.806480 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.806391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/12c8f408-58f4-4cc4-a90f-967f072165d2-dbus\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.806480 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.806443 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.806695 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.806494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/12c8f408-58f4-4cc4-a90f-967f072165d2-kubelet-config\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.806695 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.806606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/12c8f408-58f4-4cc4-a90f-967f072165d2-kubelet-config\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.806797 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.806751 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/12c8f408-58f4-4cc4-a90f-967f072165d2-dbus\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:42.806881 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.806852 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:42.806985 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.806915 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret podName:12c8f408-58f4-4cc4-a90f-967f072165d2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:43.306896738 +0000 UTC m=+4.064374153 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret") pod "global-pull-secret-syncer-6cfjr" (UID: "12c8f408-58f4-4cc4-a90f-967f072165d2") : object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:42.815351 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.815308 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 07:57:40 +0000 UTC" deadline="2027-11-14 08:51:21.843871279 +0000 UTC"
Apr 17 08:02:42.815351 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.815343 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13824h48m39.028532136s"
Apr 17 08:02:42.882976 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.882931 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:42.883142 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.883095 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:02:42.883547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:42.883528 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:42.883656 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:42.883616 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:02:43.310014 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:43.309983 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:43.310194 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:43.310122 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:43.310194 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:43.310184 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret podName:12c8f408-58f4-4cc4-a90f-967f072165d2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:44.310166723 +0000 UTC m=+5.067644129 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret") pod "global-pull-secret-syncer-6cfjr" (UID: "12c8f408-58f4-4cc4-a90f-967f072165d2") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:43.885499 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:43.885468 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:43.885929 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:43.885596 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:44.316136 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:44.316047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:44.316321 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.316216 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:44.316321 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.316296 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret podName:12c8f408-58f4-4cc4-a90f-967f072165d2 nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:46.316275013 +0000 UTC m=+7.073752443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret") pod "global-pull-secret-syncer-6cfjr" (UID: "12c8f408-58f4-4cc4-a90f-967f072165d2") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:44.417641 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:44.417072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:44.417641 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.417216 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:44.417641 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.417280 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:48.41726055 +0000 UTC m=+9.174737959 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:44.518456 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:44.517819 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:44.518456 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.518013 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:44.518456 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.518033 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:44.518456 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.518046 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:44.518456 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.518104 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:48.518084864 +0000 UTC m=+9.275562273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:44.882844 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:44.882799 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:44.883055 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:44.882800 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:44.883055 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.882958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:02:44.883188 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:44.883065 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:02:45.882489 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:45.882450 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:45.882955 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:45.882628 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:46.332061 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:46.331980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:46.332240 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:46.332168 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:46.332240 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:46.332227 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret podName:12c8f408-58f4-4cc4-a90f-967f072165d2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:50.3322101 +0000 UTC m=+11.089687506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret") pod "global-pull-secret-syncer-6cfjr" (UID: "12c8f408-58f4-4cc4-a90f-967f072165d2") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:46.882881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:46.882590 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:46.882881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:46.882633 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:46.882881 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:46.882757 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:02:46.882881 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:46.882828 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:02:47.882796 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:47.882759 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:47.882996 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:47.882905 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:48.448478 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:48.448433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:48.448634 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.448603 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:48.448693 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.448674 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:56.448654713 +0000 UTC m=+17.206132133 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:48.548955 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:48.548901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:48.549152 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.549106 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:02:48.549152 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.549128 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:02:48.549152 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.549141 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:48.549299 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.549194 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. 
No retries permitted until 2026-04-17 08:02:56.549176631 +0000 UTC m=+17.306654048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:02:48.882598 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:48.882567 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:48.882741 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:48.882622 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:48.882741 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.882684 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:02:48.882855 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:48.882788 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:02:49.883956 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:49.883918 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:49.884445 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:49.884043 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:50.363996 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:50.363897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:50.364164 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:50.364076 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:50.364164 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:50.364158 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret podName:12c8f408-58f4-4cc4-a90f-967f072165d2 nodeName:}" failed. No retries permitted until 2026-04-17 08:02:58.36413773 +0000 UTC m=+19.121615144 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret") pod "global-pull-secret-syncer-6cfjr" (UID: "12c8f408-58f4-4cc4-a90f-967f072165d2") : object "kube-system"/"original-pull-secret" not registered Apr 17 08:02:50.882697 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:50.882664 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:50.882877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:50.882662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:50.882877 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:50.882814 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:02:50.882996 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:50.882869 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:02:51.883274 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:51.883240 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:51.883719 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:51.883378 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:52.882357 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:52.882317 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:52.882634 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:52.882318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:52.882634 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:52.882464 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:02:52.882634 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:52.882528 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:02:53.882544 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:53.882399 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:53.882544 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:53.882525 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:54.882911 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:54.882878 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:54.883348 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:54.882878 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:02:54.883348 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:54.883025 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:02:54.883348 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:54.883069 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:02:55.882624 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:55.882593 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:02:55.882804 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:55.882729 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:02:56.507078 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:56.507037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:02:56.507524 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.507197 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:02:56.507524 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.507259 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:12.507244506 +0000 UTC m=+33.264721911 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 08:02:56.607593 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:56.607556 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:56.607754 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.607717 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 08:02:56.607754 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.607735 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 08:02:56.607754 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.607745 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:56.607881 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.607809 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. No retries permitted until 2026-04-17 08:03:12.607790266 +0000 UTC m=+33.365267675 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 08:02:56.882631 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:56.882546 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:56.882631 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:56.882586 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:56.882848 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.882674 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:02:56.883092 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:56.883052 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:02:57.882583 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:57.882544 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:57.883000 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:57.882692 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:02:58.420031 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:58.419993 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:58.420233 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:58.420209 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:58.420352 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:58.420284 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret podName:12c8f408-58f4-4cc4-a90f-967f072165d2 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:14.420266682 +0000 UTC m=+35.177744094 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret") pod "global-pull-secret-syncer-6cfjr" (UID: "12c8f408-58f4-4cc4-a90f-967f072165d2") : object "kube-system"/"original-pull-secret" not registered
Apr 17 08:02:58.882675 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:58.882653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:02:58.883076 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:58.882759 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:02:58.883076 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:58.882662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:02:58.883076 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:58.882949 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:02:59.014775 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.014558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2f72z" event={"ID":"0e200b23-a648-49b4-9ee6-7b0e5ceaba16","Type":"ContainerStarted","Data":"a616add15bd38488fac4a6c56c6699a5e611026b1f8cdec82456d486eb45ab0f"}
Apr 17 08:02:59.017024 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.016984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"f8f41077f58a8f9dfb3f7c25963fb96dee405e1143d421588bf362515b8f4114"}
Apr 17 08:02:59.019859 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.019835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" event={"ID":"19617c61026894db47a634e0cbb16491","Type":"ContainerStarted","Data":"8670fdff610ae50fa544bf00a452f070e641d461c3eec06593d6031e344c3abc"}
Apr 17 08:02:59.021335 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.021315 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-krs9k" event={"ID":"6b843f47-d57d-4596-961a-205314dbf0f8","Type":"ContainerStarted","Data":"b64f3cb1a182e8e46b5017533d2e7229e10984cc8e019d4f428bb6e51b4b5848"}
Apr 17 08:02:59.042362 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.042160 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2f72z" podStartSLOduration=2.395192141 podStartE2EDuration="20.042148283s" podCreationTimestamp="2026-04-17 08:02:39 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.134255847 +0000 UTC m=+1.891733254" lastFinishedPulling="2026-04-17 08:02:58.781211988 +0000 UTC m=+19.538689396" observedRunningTime="2026-04-17 08:02:59.041548057 +0000 UTC m=+19.799025487" watchObservedRunningTime="2026-04-17 08:02:59.042148283 +0000 UTC m=+19.799625712"
Apr 17 08:02:59.058143 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.058107 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-krs9k" podStartSLOduration=2.370832492 podStartE2EDuration="20.058097659s" podCreationTimestamp="2026-04-17 08:02:39 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.128585694 +0000 UTC m=+1.886063100" lastFinishedPulling="2026-04-17 08:02:58.815850858 +0000 UTC m=+19.573328267" observedRunningTime="2026-04-17 08:02:59.05777592 +0000 UTC m=+19.815253350" watchObservedRunningTime="2026-04-17 08:02:59.058097659 +0000 UTC m=+19.815575087"
Apr 17 08:02:59.070991 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.070743 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-63.ec2.internal" podStartSLOduration=19.070729529 podStartE2EDuration="19.070729529s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:02:59.070394856 +0000 UTC m=+19.827872286" watchObservedRunningTime="2026-04-17 08:02:59.070729529 +0000 UTC m=+19.828207008"
Apr 17 08:02:59.883309 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:02:59.883101 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:02:59.884020 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:02:59.883366 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:03:00.025907 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.025869 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log"
Apr 17 08:03:00.026204 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.026179 2570 generic.go:358] "Generic (PLEG): container finished" podID="832735dc-0dda-465b-96fe-56bb39f2a72b" containerID="e96aa427dda8f2b422edd63c2756b17577501714cb04709ece9982bd5b0f474c" exitCode=1
Apr 17 08:03:00.026282 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.026248 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"a0aeb288c315dd992f68cccc0a9b859357b49c07e1235b6c21badaaee347c9e7"}
Apr 17 08:03:00.026324 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.026285 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"f011a0e6edfa3266d52e18b90ba96d043829175fae83d2920ce067f958032c85"}
Apr 17 08:03:00.026324 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.026299 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"180be59eec32ab53202005b6beaa75305e448ad74c7a94f91ae4280cefe5dff6"}
Apr 17 08:03:00.026324 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.026312 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"089e747fdeb287739762263bc0feb4ba1b0ee0d058becae17d326aeb8736dc4b"}
Apr 17 08:03:00.026432 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.026324 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerDied","Data":"e96aa427dda8f2b422edd63c2756b17577501714cb04709ece9982bd5b0f474c"}
Apr 17 08:03:00.027577 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.027539 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-skwnw" event={"ID":"2c3085fe-841e-4ff9-aa63-90a0b035c240","Type":"ContainerStarted","Data":"fb0641b5ccc366762be62abecbe2e37addb6b127a193e444138121128a5399be"}
Apr 17 08:03:00.028863 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.028837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cqc8h" event={"ID":"e63f1d9b-13b7-4099-ad63-64b33b70f697","Type":"ContainerStarted","Data":"dc1f6b1e3172215514f8f551b50a88ff92a72d492fa866323183dbab353f0f31"}
Apr 17 08:03:00.030124 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.030102 2570 generic.go:358] "Generic (PLEG): container finished" podID="23bf6540eb6d43d644d0a9f36f338fa6" containerID="a763b9f2bca5c805d389e7d4a77ab95966e4af9f956c83466be5c2666405611a" exitCode=0
Apr 17 08:03:00.030224 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.030171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" event={"ID":"23bf6540eb6d43d644d0a9f36f338fa6","Type":"ContainerDied","Data":"a763b9f2bca5c805d389e7d4a77ab95966e4af9f956c83466be5c2666405611a"}
Apr 17 08:03:00.031678 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.031655 2570 generic.go:358] "Generic (PLEG): container finished" podID="26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1" containerID="1ebe54740b1d4e5fb3b5bc07dfab721f44f8c731df3f1be6a268279c17b83882" exitCode=0
Apr 17 08:03:00.031787 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.031721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerDied","Data":"1ebe54740b1d4e5fb3b5bc07dfab721f44f8c731df3f1be6a268279c17b83882"}
Apr 17 08:03:00.033349 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.033322 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" event={"ID":"d5a576c4-fd46-48a0-9584-c6849f6fca38","Type":"ContainerStarted","Data":"bacfa902e944061ef51bcc6a66ab388fe260006509573af0973ce452bd681ecd"}
Apr 17 08:03:00.034786 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.034763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7jkx8" event={"ID":"ba1fb94a-02fe-4c28-9c64-63dbb3a0662a","Type":"ContainerStarted","Data":"ef32a90991066d4ed29091c60765113b22e3af160b725b71c0b6308dd004eff5"}
Apr 17 08:03:00.045354 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.045303 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-skwnw" podStartSLOduration=2.421722744 podStartE2EDuration="20.045289726s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.15785626 +0000 UTC m=+1.915333670" lastFinishedPulling="2026-04-17 08:02:58.781423243 +0000 UTC m=+19.538900652" observedRunningTime="2026-04-17 08:03:00.044819748 +0000 UTC m=+20.802297176" watchObservedRunningTime="2026-04-17 08:03:00.045289726 +0000 UTC m=+20.802767153"
Apr 17 08:03:00.098379 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.098321 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7jkx8" podStartSLOduration=3.482283442 podStartE2EDuration="21.098304355s" podCreationTimestamp="2026-04-17 08:02:39 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.137007884 +0000 UTC m=+1.894485290" lastFinishedPulling="2026-04-17 08:02:58.753028793 +0000 UTC m=+19.510506203" observedRunningTime="2026-04-17 08:03:00.082210149 +0000 UTC m=+20.839687576" watchObservedRunningTime="2026-04-17 08:03:00.098304355 +0000 UTC m=+20.855781784"
Apr 17 08:03:00.112175 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.112117 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cqc8h" podStartSLOduration=3.447109332 podStartE2EDuration="21.112100821s" podCreationTimestamp="2026-04-17 08:02:39 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.116259349 +0000 UTC m=+1.873736756" lastFinishedPulling="2026-04-17 08:02:58.781250828 +0000 UTC m=+19.538728245" observedRunningTime="2026-04-17 08:03:00.112012281 +0000 UTC m=+20.869489710" watchObservedRunningTime="2026-04-17 08:03:00.112100821 +0000 UTC m=+20.869578251"
Apr 17 08:03:00.465984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.465812 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 08:03:00.846995 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.846808 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T08:03:00.46598026Z","UUID":"6295964f-d166-457e-a022-96421617474e","Handler":null,"Name":"","Endpoint":""}
Apr 17 08:03:00.849856 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.849834 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 08:03:00.850008 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.849864 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 08:03:00.883087 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.883059 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:03:00.883087 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:00.883085 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:00.883308 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:00.883184 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:03:00.883308 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:00.883268 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:03:01.039567 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.039321 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" event={"ID":"23bf6540eb6d43d644d0a9f36f338fa6","Type":"ContainerStarted","Data":"06e9d268b5f614c6fc600248e3e42a03f6caca12bbaeaa357291392398ea59ec"}
Apr 17 08:03:01.041813 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.041774 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" event={"ID":"d5a576c4-fd46-48a0-9584-c6849f6fca38","Type":"ContainerStarted","Data":"859dbd8a48230a35f8ba4bb3df737144521aada248a9be429eaac647a9e0d7de"}
Apr 17 08:03:01.043539 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.043511 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sd8ch" event={"ID":"aa9f1d02-c04d-4591-a2d3-aa61e92869ba","Type":"ContainerStarted","Data":"fdf5ad317a2c0514af31ae3adebf91bd5032fe0d1ab1c8360a1663c975d54868"}
Apr 17 08:03:01.054973 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.054905 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-63.ec2.internal" podStartSLOduration=21.054890313 podStartE2EDuration="21.054890313s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:03:01.054381883 +0000 UTC m=+21.811859314" watchObservedRunningTime="2026-04-17 08:03:01.054890313 +0000 UTC m=+21.812367746"
Apr 17 08:03:01.069151 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.069108 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sd8ch" podStartSLOduration=3.431714489 podStartE2EDuration="21.069096088s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.143958763 +0000 UTC m=+1.901436173" lastFinishedPulling="2026-04-17 08:02:58.781340348 +0000 UTC m=+19.538817772" observedRunningTime="2026-04-17 08:03:01.068608178 +0000 UTC m=+21.826085605" watchObservedRunningTime="2026-04-17 08:03:01.069096088 +0000 UTC m=+21.826573562"
Apr 17 08:03:01.856171 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.856093 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:03:01.882524 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:01.882496 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:03:01.882705 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:01.882616 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:03:02.048575 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:02.048544 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log"
Apr 17 08:03:02.049220 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:02.048899 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"bc7afc0e69686b4f8c8e57c587205743d69dac2520ce251fa702c8c3ff2f3ff6"}
Apr 17 08:03:02.050750 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:02.050720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" event={"ID":"d5a576c4-fd46-48a0-9584-c6849f6fca38","Type":"ContainerStarted","Data":"f25b870b491b5b525944e6053e7a084e69859afef8baa35e4e01d8c09fdc3146"}
Apr 17 08:03:02.071688 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:02.071639 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9tl68" podStartSLOduration=1.7309806540000001 podStartE2EDuration="22.071622632s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.149113384 +0000 UTC m=+1.906590794" lastFinishedPulling="2026-04-17 08:03:01.489755349 +0000 UTC m=+22.247232772" observedRunningTime="2026-04-17 08:03:02.071417782 +0000 UTC m=+22.828895210" watchObservedRunningTime="2026-04-17 08:03:02.071622632 +0000 UTC m=+22.829100062"
Apr 17 08:03:02.882542 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:02.882510 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:02.882743 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:02.882513 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:03:02.882743 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:02.882633 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:03:02.882743 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:02.882733 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:03:03.883370 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:03.883053 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:03:03.884041 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:03.883376 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:03:03.912923 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:03.912898 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:03:03.913499 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:03.913473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:03:04.056917 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:04.056888 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log"
Apr 17 08:03:04.057229 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:04.057207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"c4c3f1f3f5eb7d3480aa6fe87c7a085aff6de76706b93fa09965463ea892fdac"}
Apr 17 08:03:04.057688 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:04.057656 2570 scope.go:117] "RemoveContainer" containerID="e96aa427dda8f2b422edd63c2756b17577501714cb04709ece9982bd5b0f474c"
Apr 17 08:03:04.058443 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:04.058426 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7jkx8"
Apr 17 08:03:04.883302 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:04.883242 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:04.883582 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:04.883243 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:03:04.883582 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:04.883369 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:03:04.883582 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:04.883469 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:03:05.606236 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:05.606199 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rkbs5"]
Apr 17 08:03:05.606427 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:05.606315 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:05.606496 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:05.606417 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:03:05.608629 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:05.608569 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6cfjr"]
Apr 17 08:03:05.608768 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:05.608697 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:03:05.608828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:05.608797 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:03:05.609572 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:05.609483 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x6js9"]
Apr 17 08:03:05.609668 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:05.609576 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:03:05.609668 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:05.609648 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:03:06.065046 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.065023 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log"
Apr 17 08:03:06.065412 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.065298 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" event={"ID":"832735dc-0dda-465b-96fe-56bb39f2a72b","Type":"ContainerStarted","Data":"14cb06fa04d86e41935f1159829882f96cfdf46725971ba2ef4446db201d9c37"}
Apr 17 08:03:06.065603 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.065564 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:03:06.065603 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.065591 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:03:06.065702 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.065604 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:03:06.079883 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.079851 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:03:06.080046 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.080026 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:03:06.094703 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.094662 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" podStartSLOduration=9.299713428 podStartE2EDuration="27.094644295s" podCreationTimestamp="2026-04-17 08:02:39 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.098439033 +0000 UTC m=+1.855916440" lastFinishedPulling="2026-04-17 08:02:58.893369898 +0000 UTC m=+19.650847307" observedRunningTime="2026-04-17 08:03:06.092780468 +0000 UTC m=+26.850257919" watchObservedRunningTime="2026-04-17 08:03:06.094644295 +0000 UTC m=+26.852121727"
Apr 17 08:03:06.882758 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.882730 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:06.883037 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:06.882739 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:03:06.883037 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:06.882837 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:03:06.883037 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:06.882958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2"
Apr 17 08:03:07.071489 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:07.071445 2570 generic.go:358] "Generic (PLEG): container finished" podID="26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1" containerID="a6cba4a5f1ec57fc13979c9d0917c9ef5ee99f8ff01afa692024c9a141c874f6" exitCode=0
Apr 17 08:03:07.071843 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:07.071543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerDied","Data":"a6cba4a5f1ec57fc13979c9d0917c9ef5ee99f8ff01afa692024c9a141c874f6"}
Apr 17 08:03:07.883351 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:07.883047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:03:07.883482 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:07.883454 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65"
Apr 17 08:03:08.084668 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:08.084609 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9" podUID="832735dc-0dda-465b-96fe-56bb39f2a72b" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 17 08:03:08.883065 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:08.883038 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:08.883263 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:08.883038 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr"
Apr 17 08:03:08.883263 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:08.883153 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd"
Apr 17 08:03:08.883263 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:08.883207 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:03:09.077023 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:09.076988 2570 generic.go:358] "Generic (PLEG): container finished" podID="26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1" containerID="f8bdeeeaf561507844385c419be5ab7a5c790f8b7b53030e5467bcadf10d807f" exitCode=0 Apr 17 08:03:09.077184 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:09.077050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerDied","Data":"f8bdeeeaf561507844385c419be5ab7a5c790f8b7b53030e5467bcadf10d807f"} Apr 17 08:03:09.883504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:09.883428 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:03:09.883990 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:09.883527 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:03:10.080855 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:10.080824 2570 generic.go:358] "Generic (PLEG): container finished" podID="26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1" containerID="487272d2b16f0298b5d415318c8e5d17fbfcfb916e076feff6d9e3bfda612940" exitCode=0 Apr 17 08:03:10.081041 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:10.080880 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerDied","Data":"487272d2b16f0298b5d415318c8e5d17fbfcfb916e076feff6d9e3bfda612940"} Apr 17 08:03:10.883302 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:10.883267 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:03:10.883506 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:10.883267 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:03:10.883506 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:10.883381 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rkbs5" podUID="36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd" Apr 17 08:03:10.883506 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:10.883475 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-6cfjr" podUID="12c8f408-58f4-4cc4-a90f-967f072165d2" Apr 17 08:03:11.883397 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:11.883322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:03:11.883547 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:11.883479 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:03:12.028016 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.027988 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-63.ec2.internal" event="NodeReady" Apr 17 08:03:12.028396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.028126 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 08:03:12.060045 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.060013 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64cb969cfb-fb4ps"] Apr 17 08:03:12.067341 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.067317 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.070434 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.070406 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 08:03:12.070772 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.070750 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5m5pl\"" Apr 17 08:03:12.070899 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.070871 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 08:03:12.076636 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.071647 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 08:03:12.076636 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.075873 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-252zk"] Apr 17 08:03:12.082186 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.082165 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 08:03:12.086959 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.086921 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64cb969cfb-fb4ps"] Apr 17 08:03:12.087078 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.086966 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wp9g5"] Apr 17 08:03:12.087218 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.087114 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.089713 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.089670 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 08:03:12.089868 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.089844 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 08:03:12.089978 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.089872 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wdhxs\"" Apr 17 08:03:12.094633 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.094612 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-252zk"] Apr 17 08:03:12.094733 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.094641 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wp9g5"] Apr 17 08:03:12.094827 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.094768 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:12.097295 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.097266 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 08:03:12.097482 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.097346 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 08:03:12.097630 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.097614 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 08:03:12.097730 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.097710 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x9bwm\"" Apr 17 08:03:12.224385 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-registry-certificates\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224385 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224341 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-installation-pull-secrets\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224584 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224438 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mb7c\" (UniqueName: \"kubernetes.io/projected/116a85c5-54d8-4462-9305-b1de37bca8cf-kube-api-access-6mb7c\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:12.224584 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421db932-74ef-4855-b174-a7ce6bca201b-config-volume\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.224584 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224502 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/421db932-74ef-4855-b174-a7ce6bca201b-tmp-dir\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.224700 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-bound-sa-token\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224700 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224621 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " 
pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224700 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224648 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-trusted-ca\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224700 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73989292-93a5-4241-9cd4-5833981ca4eb-ca-trust-extracted\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224715 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:12.224881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btkct\" (UniqueName: \"kubernetes.io/projected/421db932-74ef-4855-b174-a7ce6bca201b-kube-api-access-btkct\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.224881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224832 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-image-registry-private-configuration\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.224881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdddd\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-kube-api-access-vdddd\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.225043 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.224885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.325452 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73989292-93a5-4241-9cd4-5833981ca4eb-ca-trust-extracted\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.325627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 
08:03:12.325627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btkct\" (UniqueName: \"kubernetes.io/projected/421db932-74ef-4855-b174-a7ce6bca201b-kube-api-access-btkct\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.325627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325542 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-image-registry-private-configuration\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.325627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdddd\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-kube-api-access-vdddd\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.325855 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.325777 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:12.325855 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.325855 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.325844 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:03:12.82582345 +0000 UTC m=+33.583300862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found Apr 17 08:03:12.326062 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.325886 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:12.326062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-registry-certificates\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.326062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73989292-93a5-4241-9cd4-5833981ca4eb-ca-trust-extracted\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.326062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.325934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-installation-pull-secrets\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " 
pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.326062 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.325997 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:12.825977325 +0000 UTC m=+33.583454733 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:12.326062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326050 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mb7c\" (UniqueName: \"kubernetes.io/projected/116a85c5-54d8-4462-9305-b1de37bca8cf-kube-api-access-6mb7c\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:12.326388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326080 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421db932-74ef-4855-b174-a7ce6bca201b-config-volume\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.326388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/421db932-74ef-4855-b174-a7ce6bca201b-tmp-dir\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.326388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326142 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-bound-sa-token\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.326388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.326388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326193 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-trusted-ca\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.326388 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.326381 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:03:12.326682 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.326396 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found Apr 17 08:03:12.326682 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.326447 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. 
No retries permitted until 2026-04-17 08:03:12.82643314 +0000 UTC m=+33.583910557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found Apr 17 08:03:12.326682 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.326444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-registry-certificates\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.327387 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.327174 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-trusted-ca\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.327505 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.327257 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/421db932-74ef-4855-b174-a7ce6bca201b-config-volume\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.327505 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.327486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/421db932-74ef-4855-b174-a7ce6bca201b-tmp-dir\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.331197 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.331177 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-installation-pull-secrets\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.331274 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.331182 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-image-registry-private-configuration\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.338463 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.338439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdddd\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-kube-api-access-vdddd\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.338885 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.338863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-bound-sa-token\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.339014 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.338989 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btkct\" (UniqueName: 
\"kubernetes.io/projected/421db932-74ef-4855-b174-a7ce6bca201b-kube-api-access-btkct\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.339406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.339385 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mb7c\" (UniqueName: \"kubernetes.io/projected/116a85c5-54d8-4462-9305-b1de37bca8cf-kube-api-access-6mb7c\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:12.527382 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.527345 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:03:12.527566 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.527532 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:03:12.527623 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.527612 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:03:44.527589014 +0000 UTC m=+65.285066421 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 08:03:12.628335 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.628296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:03:12.628507 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.628453 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 08:03:12.628507 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.628477 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 08:03:12.628507 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.628491 2570 projected.go:194] Error preparing data for projected volume kube-api-access-qx998 for pod openshift-network-diagnostics/network-check-target-rkbs5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:03:12.628650 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.628550 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998 podName:36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd nodeName:}" failed. 
No retries permitted until 2026-04-17 08:03:44.628535777 +0000 UTC m=+65.386013198 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qx998" (UniqueName: "kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998") pod "network-check-target-rkbs5" (UID: "36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 08:03:12.830676 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.830585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:12.830676 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.830645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.830705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830730 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830753 2570 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830810 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. No retries permitted until 2026-04-17 08:03:13.830788073 +0000 UTC m=+34.588265479 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830809 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830831 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830863 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:03:13.830848807 +0000 UTC m=+34.588326216 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found Apr 17 08:03:12.830907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:12.830884 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:13.830869495 +0000 UTC m=+34.588346908 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:12.883168 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.883133 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5" Apr 17 08:03:12.883339 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.883179 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:03:12.886272 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.886228 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 08:03:12.886410 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.886276 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 08:03:12.886410 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.886286 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 08:03:12.886527 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:12.886444 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nfrwk\"" Apr 17 08:03:13.840335 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:13.840286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:13.840450 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:13.840502 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:13.840828 ip-10-0-138-63 
kubenswrapper[2570]: E0417 08:03:13.840545 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:15.840507702 +0000 UTC m=+36.597985111 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:13.840599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:13.840605 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:13.840663 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:13.840666 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:13.840700 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. 
No retries permitted until 2026-04-17 08:03:15.840687396 +0000 UTC m=+36.598164814 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found Apr 17 08:03:13.840828 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:13.840722 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:03:15.840706251 +0000 UTC m=+36.598183660 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found Apr 17 08:03:13.883185 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:13.883155 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:03:13.887718 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:13.887687 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 08:03:13.887888 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:13.887733 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x2n8l\"" Apr 17 08:03:14.445952 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:14.445900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:03:14.448984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:14.448955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12c8f408-58f4-4cc4-a90f-967f072165d2-original-pull-secret\") pod \"global-pull-secret-syncer-6cfjr\" (UID: \"12c8f408-58f4-4cc4-a90f-967f072165d2\") " pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:03:14.695096 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:14.695054 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-6cfjr" Apr 17 08:03:15.620472 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:15.620438 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6cfjr"] Apr 17 08:03:15.632787 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:03:15.632753 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c8f408_58f4_4cc4_a90f_967f072165d2.slice/crio-bb18e21305d456ce85e157390c5de6051e396efcfcfebd5482328759717a0122 WatchSource:0}: Error finding container bb18e21305d456ce85e157390c5de6051e396efcfcfebd5482328759717a0122: Status 404 returned error can't find the container with id bb18e21305d456ce85e157390c5de6051e396efcfcfebd5482328759717a0122 Apr 17 08:03:15.856746 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:15.856713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:15.856921 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:15.856768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:15.856921 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.856849 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:15.856921 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.856861 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 
08:03:15.856921 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.856898 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:19.85688472 +0000 UTC m=+40.614362129 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:15.856921 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.856913 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:03:19.856906766 +0000 UTC m=+40.614384172 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found Apr 17 08:03:15.857136 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:15.856982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:15.857136 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.857060 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:03:15.857136 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.857070 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found Apr 17 08:03:15.857136 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:15.857102 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. No retries permitted until 2026-04-17 08:03:19.857091523 +0000 UTC m=+40.614568936 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found Apr 17 08:03:16.094140 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:16.094102 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6cfjr" event={"ID":"12c8f408-58f4-4cc4-a90f-967f072165d2","Type":"ContainerStarted","Data":"bb18e21305d456ce85e157390c5de6051e396efcfcfebd5482328759717a0122"} Apr 17 08:03:16.096200 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:16.096171 2570 generic.go:358] "Generic (PLEG): container finished" podID="26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1" containerID="97dc1d1bd1f624425f01371a6b12708a236f6b39a8e9d57f6cb72fd05d09163e" exitCode=0 Apr 17 08:03:16.096308 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:16.096224 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerDied","Data":"97dc1d1bd1f624425f01371a6b12708a236f6b39a8e9d57f6cb72fd05d09163e"} Apr 17 08:03:17.101678 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:17.101640 2570 generic.go:358] "Generic (PLEG): container finished" podID="26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1" containerID="b80292b5ad3992bbe803856d7e5ea76302ddb8cec222aa8ba2f6119c609598fb" exitCode=0 Apr 17 08:03:17.102198 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:17.101708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerDied","Data":"b80292b5ad3992bbe803856d7e5ea76302ddb8cec222aa8ba2f6119c609598fb"} Apr 17 08:03:18.109290 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:18.109252 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xc6x7" event={"ID":"26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1","Type":"ContainerStarted","Data":"afc7092beeea3a8943c05493167e7a14c1f896d81693025619c59cf437a20d8e"} Apr 17 08:03:18.132468 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:18.132415 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xc6x7" podStartSLOduration=3.626608508 podStartE2EDuration="38.132401599s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="2026-04-17 08:02:41.154276746 +0000 UTC m=+1.911754154" lastFinishedPulling="2026-04-17 08:03:15.660069825 +0000 UTC m=+36.417547245" observedRunningTime="2026-04-17 08:03:18.130221817 +0000 UTC m=+38.887699248" watchObservedRunningTime="2026-04-17 08:03:18.132401599 +0000 UTC m=+38.889879027" Apr 17 08:03:19.894726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:19.894696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:19.894748 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.894825 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.894837 2570 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.894842 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.894877 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:03:27.894865115 +0000 UTC m=+48.652342522 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.894889 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. No retries permitted until 2026-04-17 08:03:27.894883184 +0000 UTC m=+48.652360590 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:19.894902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.894982 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:19.895102 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:19.895010 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:27.895003244 +0000 UTC m=+48.652480650 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:20.114023 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:20.113982 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6cfjr" event={"ID":"12c8f408-58f4-4cc4-a90f-967f072165d2","Type":"ContainerStarted","Data":"3209be6244e81216a7c5b98c4cd6dfc7affc0b067c50cbcf7e6a10ce821c657d"} Apr 17 08:03:20.129323 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:20.129271 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6cfjr" podStartSLOduration=34.268625209 podStartE2EDuration="38.129257162s" podCreationTimestamp="2026-04-17 08:02:42 +0000 UTC" firstStartedPulling="2026-04-17 08:03:15.637524861 +0000 UTC m=+36.395002267" lastFinishedPulling="2026-04-17 08:03:19.498156814 +0000 UTC m=+40.255634220" observedRunningTime="2026-04-17 08:03:20.128879988 +0000 UTC m=+40.886357415" watchObservedRunningTime="2026-04-17 08:03:20.129257162 +0000 UTC m=+40.886734644" Apr 17 08:03:27.951067 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:27.951026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:27.951092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " 
pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:27.951132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951212 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951237 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951252 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951284 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951319 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:03:43.95129631 +0000 UTC m=+64.708773723 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951343 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. No retries permitted until 2026-04-17 08:03:43.951329836 +0000 UTC m=+64.708807242 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found Apr 17 08:03:27.951578 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:27.951358 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:03:43.951350805 +0000 UTC m=+64.708828210 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found
Apr 17 08:03:38.083205 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:38.083176 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nt4p9"
Apr 17 08:03:43.960495 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:43.960460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5"
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:43.960513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk"
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:43.960558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps"
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960633 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960651 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960664 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960665 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960711 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. No retries permitted until 2026-04-17 08:04:15.960696988 +0000 UTC m=+96.718174394 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960724 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:04:15.960718037 +0000 UTC m=+96.718195443 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found
Apr 17 08:03:43.961011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:43.960733 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:04:15.960728443 +0000 UTC m=+96.718205850 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found
Apr 17 08:03:44.565796 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.565750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:03:44.568625 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.568604 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 08:03:44.576903 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:44.576881 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 08:03:44.577011 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:03:44.576984 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed.
No retries permitted until 2026-04-17 08:04:48.576959839 +0000 UTC m=+129.334437244 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : secret "metrics-daemon-secret" not found
Apr 17 08:03:44.666665 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.666631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:44.669670 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.669652 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 08:03:44.680180 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.680166 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 08:03:44.691815 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.691793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx998\" (UniqueName: \"kubernetes.io/projected/36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd-kube-api-access-qx998\") pod \"network-check-target-rkbs5\" (UID: \"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd\") " pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:44.703107 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.703087 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-nfrwk\""
Apr 17 08:03:44.711253 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.711238 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:44.820139 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:44.820068 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rkbs5"]
Apr 17 08:03:44.831966 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:03:44.831920 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36ae9c9b_1417_4e3d_8f1a_e54cbe63c9dd.slice/crio-46c667758fa0941c47fbc8ae2be1636b01731e3e43b1e431a2d8cfd07ae8301d WatchSource:0}: Error finding container 46c667758fa0941c47fbc8ae2be1636b01731e3e43b1e431a2d8cfd07ae8301d: Status 404 returned error can't find the container with id 46c667758fa0941c47fbc8ae2be1636b01731e3e43b1e431a2d8cfd07ae8301d
Apr 17 08:03:45.160699 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:45.160614 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rkbs5" event={"ID":"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd","Type":"ContainerStarted","Data":"46c667758fa0941c47fbc8ae2be1636b01731e3e43b1e431a2d8cfd07ae8301d"}
Apr 17 08:03:48.167376 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:48.167341 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rkbs5" event={"ID":"36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd","Type":"ContainerStarted","Data":"478680de52aa15f00484b9e9ba5eed4b1fe1e5f12fcdc2ab45fe0420df96dd7a"}
Apr 17 08:03:48.167755 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:48.167482 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:03:48.183726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:03:48.183673 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-network-diagnostics/network-check-target-rkbs5" podStartSLOduration=66.460259807 podStartE2EDuration="1m9.183659526s" podCreationTimestamp="2026-04-17 08:02:39 +0000 UTC" firstStartedPulling="2026-04-17 08:03:44.833716309 +0000 UTC m=+65.591193718" lastFinishedPulling="2026-04-17 08:03:47.557116026 +0000 UTC m=+68.314593437" observedRunningTime="2026-04-17 08:03:48.183481978 +0000 UTC m=+68.940959406" watchObservedRunningTime="2026-04-17 08:03:48.183659526 +0000 UTC m=+68.941136953"
Apr 17 08:04:15.977706 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:15.977665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps"
Apr 17 08:04:15.977706 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:15.977713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5"
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:15.977746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk"
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977833 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977857 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64cb969cfb-fb4ps: secret "image-registry-tls" not found
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977865 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977906 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977923 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert podName:116a85c5-54d8-4462-9305-b1de37bca8cf nodeName:}" failed. No retries permitted until 2026-04-17 08:05:19.97790871 +0000 UTC m=+160.735386118 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert") pod "ingress-canary-wp9g5" (UID: "116a85c5-54d8-4462-9305-b1de37bca8cf") : secret "canary-serving-cert" not found
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977951 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls podName:73989292-93a5-4241-9cd4-5833981ca4eb nodeName:}" failed. No retries permitted until 2026-04-17 08:05:19.977931453 +0000 UTC m=+160.735408858 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls") pod "image-registry-64cb969cfb-fb4ps" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb") : secret "image-registry-tls" not found
Apr 17 08:04:15.978335 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:15.977978 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls podName:421db932-74ef-4855-b174-a7ce6bca201b nodeName:}" failed. No retries permitted until 2026-04-17 08:05:19.977965179 +0000 UTC m=+160.735442591 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls") pod "dns-default-252zk" (UID: "421db932-74ef-4855-b174-a7ce6bca201b") : secret "dns-default-metrics-tls" not found
Apr 17 08:04:19.172214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:19.172178 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rkbs5"
Apr 17 08:04:20.205932 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.205885 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"]
Apr 17 08:04:20.221512 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.221468 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"]
Apr 17 08:04:20.221663 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.221590 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.227417 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.227389 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 08:04:20.227572 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.227389 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 08:04:20.227650 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.227588 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 08:04:20.227709 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.227651 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-r9xfp\""
Apr 17 08:04:20.227709 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.227695 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 08:04:20.309306 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.309270 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.309306 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.309311 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwz2\" (UniqueName: \"kubernetes.io/projected/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-kube-api-access-lhwz2\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.309547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.309351 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.309977 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.309956 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vwznj"]
Apr 17 08:04:20.320631 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.320604 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.321626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.321585 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vwznj"]
Apr 17 08:04:20.323349 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.323323 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 08:04:20.323473 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.323323 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 08:04:20.323473 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.323380 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 08:04:20.323976 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.323917 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-n28ns\""
Apr 17 08:04:20.324088 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.324041 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 08:04:20.329253 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.329235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 08:04:20.410609 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-snapshots\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.410609 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.410835 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-service-ca-bundle\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.410835 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410721 2570 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.410835 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-tmp\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.410835 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-serving-cert\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.411045 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhwz2\" (UniqueName: \"kubernetes.io/projected/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-kube-api-access-lhwz2\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.411045 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:20.410871 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:20.411045 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.411045 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.410915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8st\" (UniqueName: \"kubernetes.io/projected/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-kube-api-access-fq8st\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.411045 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:20.410971 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls podName:6c639c32-2f50-4f3f-9ba1-c99215cb7e01 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:20.910928541 +0000 UTC m=+101.668405960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f85s5" (UID: "6c639c32-2f50-4f3f-9ba1-c99215cb7e01") : secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:20.412328 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.412309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.419322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.419301 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwz2\" (UniqueName: \"kubernetes.io/projected/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-kube-api-access-lhwz2\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.512152 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-service-ca-bundle\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-tmp\") pod \"insights-operator-585dfdc468-vwznj\" (UID:
\"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-serving-cert\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8st\" (UniqueName: \"kubernetes.io/projected/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-kube-api-access-fq8st\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-snapshots\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512330 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512596 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512569 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-tmp\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512789 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-service-ca-bundle\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.512924 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.512827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-snapshots\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.513114 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.513093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.514547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.514531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-serving-cert\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.520622 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.520597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8st\" (UniqueName: \"kubernetes.io/projected/29f0e625-d1d8-412e-8a62-f3d9c9c33c3e-kube-api-access-fq8st\") pod \"insights-operator-585dfdc468-vwznj\" (UID: \"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e\") " pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.630697 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.630661 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-vwznj"
Apr 17 08:04:20.742199 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.742165 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vwznj"]
Apr 17 08:04:20.744747 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:20.744707 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f0e625_d1d8_412e_8a62_f3d9c9c33c3e.slice/crio-daec60d1a93d45672692fa3078dc1a635878142dd69f16d55345f4cdc28e92af WatchSource:0}: Error finding container daec60d1a93d45672692fa3078dc1a635878142dd69f16d55345f4cdc28e92af: Status 404 returned error can't find the container with id daec60d1a93d45672692fa3078dc1a635878142dd69f16d55345f4cdc28e92af
Apr 17 08:04:20.914555 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:20.914467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:20.914708 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:20.914619 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret
"cluster-monitoring-operator-tls" not found
Apr 17 08:04:20.914708 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:20.914688 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls podName:6c639c32-2f50-4f3f-9ba1-c99215cb7e01 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:21.914671962 +0000 UTC m=+102.672149369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f85s5" (UID: "6c639c32-2f50-4f3f-9ba1-c99215cb7e01") : secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:21.228060 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:21.227974 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vwznj" event={"ID":"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e","Type":"ContainerStarted","Data":"daec60d1a93d45672692fa3078dc1a635878142dd69f16d55345f4cdc28e92af"}
Apr 17 08:04:21.924037 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:21.923995 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:21.924221 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:21.924169 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:21.924269 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:21.924249 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls podName:6c639c32-2f50-4f3f-9ba1-c99215cb7e01 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:23.924225618 +0000 UTC m=+104.681703039 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f85s5" (UID: "6c639c32-2f50-4f3f-9ba1-c99215cb7e01") : secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:23.233359 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:23.233314 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vwznj" event={"ID":"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e","Type":"ContainerStarted","Data":"d1e9cae1a4f122a885ba55ca2a6bdbb538b1dd844ea7a9f6bf8c35db1321c6cc"}
Apr 17 08:04:23.249696 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:23.249624 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-vwznj" podStartSLOduration=0.884119842 podStartE2EDuration="3.249611886s" podCreationTimestamp="2026-04-17 08:04:20 +0000 UTC" firstStartedPulling="2026-04-17 08:04:20.746350376 +0000 UTC m=+101.503827782" lastFinishedPulling="2026-04-17 08:04:23.111842417 +0000 UTC m=+103.869319826" observedRunningTime="2026-04-17 08:04:23.24883617 +0000 UTC m=+104.006313598" watchObservedRunningTime="2026-04-17 08:04:23.249611886 +0000 UTC m=+104.007089313"
Apr 17 08:04:23.937973 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:23.937911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:23.938149 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:23.938048 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:23.938149 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:23.938111 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls podName:6c639c32-2f50-4f3f-9ba1-c99215cb7e01 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:27.938095404 +0000 UTC m=+108.695572816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f85s5" (UID: "6c639c32-2f50-4f3f-9ba1-c99215cb7e01") : secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:26.464986 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:26.464959 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cqc8h_e63f1d9b-13b7-4099-ad63-64b33b70f697/dns-node-resolver/0.log"
Apr 17 08:04:27.665146 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:27.665118 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-skwnw_2c3085fe-841e-4ff9-aa63-90a0b035c240/node-ca/0.log"
Apr 17 08:04:27.967457 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:27.967376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"
Apr 17 08:04:27.967611 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:27.967516 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:27.967611 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:27.967586 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls podName:6c639c32-2f50-4f3f-9ba1-c99215cb7e01 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:35.967568005 +0000 UTC m=+116.725045430 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f85s5" (UID: "6c639c32-2f50-4f3f-9ba1-c99215cb7e01") : secret "cluster-monitoring-operator-tls" not found
Apr 17 08:04:30.204079 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.204042 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d"]
Apr 17 08:04:30.206383 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.206366 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.209139 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.209114 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9dxhj\"" Apr 17 08:04:30.209231 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.209121 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:04:30.209231 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.209124 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 08:04:30.210541 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.210523 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 08:04:30.217096 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.217077 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d"] Apr 17 08:04:30.285711 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.285682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w24b\" (UniqueName: \"kubernetes.io/projected/daaeed21-ac53-4784-abb1-fc080fe469a9-kube-api-access-6w24b\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.285834 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.285820 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.302681 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.302658 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb"] Apr 17 08:04:30.304773 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.304758 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.307472 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.307443 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 08:04:30.307591 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.307514 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 08:04:30.307697 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.307676 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bmnrd\"" Apr 17 08:04:30.307697 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.307691 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:04:30.307826 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.307798 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 08:04:30.309211 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.309192 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p"] Apr 17 08:04:30.311376 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.311358 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" Apr 17 08:04:30.316217 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.316198 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 08:04:30.317150 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.317129 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb"] Apr 17 08:04:30.317804 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.317787 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 08:04:30.317947 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.317920 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-kgc4f\"" Apr 17 08:04:30.322793 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.322774 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p"] Apr 17 08:04:30.386633 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.386605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rnw\" (UniqueName: \"kubernetes.io/projected/ba78d13c-1ebd-4761-b661-3cc6591106b7-kube-api-access-v8rnw\") pod \"volume-data-source-validator-7c6cbb6c87-vwc9p\" (UID: \"ba78d13c-1ebd-4761-b661-3cc6591106b7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" Apr 17 08:04:30.386800 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.386652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d36857-0992-41ad-aa34-4e41c08ace48-serving-cert\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.386800 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.386721 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d36857-0992-41ad-aa34-4e41c08ace48-config\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.386800 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.386751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w24b\" (UniqueName: \"kubernetes.io/projected/daaeed21-ac53-4784-abb1-fc080fe469a9-kube-api-access-6w24b\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.386981 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.386843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbkr\" (UniqueName: \"kubernetes.io/projected/c9d36857-0992-41ad-aa34-4e41c08ace48-kube-api-access-4zbkr\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.386981 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.386873 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.386981 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:30.386972 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 08:04:30.387107 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:30.387020 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls podName:daaeed21-ac53-4784-abb1-fc080fe469a9 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:30.887006704 +0000 UTC m=+111.644484110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z86d" (UID: "daaeed21-ac53-4784-abb1-fc080fe469a9") : secret "samples-operator-tls" not found Apr 17 08:04:30.397528 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.397508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w24b\" (UniqueName: \"kubernetes.io/projected/daaeed21-ac53-4784-abb1-fc080fe469a9-kube-api-access-6w24b\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.487626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.487553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbkr\" (UniqueName: \"kubernetes.io/projected/c9d36857-0992-41ad-aa34-4e41c08ace48-kube-api-access-4zbkr\") pod 
\"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.487626 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.487621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rnw\" (UniqueName: \"kubernetes.io/projected/ba78d13c-1ebd-4761-b661-3cc6591106b7-kube-api-access-v8rnw\") pod \"volume-data-source-validator-7c6cbb6c87-vwc9p\" (UID: \"ba78d13c-1ebd-4761-b661-3cc6591106b7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" Apr 17 08:04:30.487765 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.487645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d36857-0992-41ad-aa34-4e41c08ace48-serving-cert\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.487765 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.487691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d36857-0992-41ad-aa34-4e41c08ace48-config\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.488201 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.488184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d36857-0992-41ad-aa34-4e41c08ace48-config\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.489808 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.489793 
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d36857-0992-41ad-aa34-4e41c08ace48-serving-cert\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.495520 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.495500 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zbkr\" (UniqueName: \"kubernetes.io/projected/c9d36857-0992-41ad-aa34-4e41c08ace48-kube-api-access-4zbkr\") pod \"service-ca-operator-d6fc45fc5-csrrb\" (UID: \"c9d36857-0992-41ad-aa34-4e41c08ace48\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.495619 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.495507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rnw\" (UniqueName: \"kubernetes.io/projected/ba78d13c-1ebd-4761-b661-3cc6591106b7-kube-api-access-v8rnw\") pod \"volume-data-source-validator-7c6cbb6c87-vwc9p\" (UID: \"ba78d13c-1ebd-4761-b661-3cc6591106b7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" Apr 17 08:04:30.613833 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.613793 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" Apr 17 08:04:30.620420 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.620393 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" Apr 17 08:04:30.732676 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.732649 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb"] Apr 17 08:04:30.735628 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:30.735599 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d36857_0992_41ad_aa34_4e41c08ace48.slice/crio-2989f0da971cf059f2e1a536ac138dc1bf68e1d8561cb05166af434e22a12e7a WatchSource:0}: Error finding container 2989f0da971cf059f2e1a536ac138dc1bf68e1d8561cb05166af434e22a12e7a: Status 404 returned error can't find the container with id 2989f0da971cf059f2e1a536ac138dc1bf68e1d8561cb05166af434e22a12e7a Apr 17 08:04:30.750105 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.750051 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p"] Apr 17 08:04:30.753409 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:30.753385 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba78d13c_1ebd_4761_b661_3cc6591106b7.slice/crio-dbf291552644547335c59619c11c81bff31819db503ffdc24fb8846443d4c681 WatchSource:0}: Error finding container dbf291552644547335c59619c11c81bff31819db503ffdc24fb8846443d4c681: Status 404 returned error can't find the container with id dbf291552644547335c59619c11c81bff31819db503ffdc24fb8846443d4c681 Apr 17 08:04:30.890517 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:30.890479 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: 
\"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:30.890663 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:30.890627 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 08:04:30.890703 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:30.890689 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls podName:daaeed21-ac53-4784-abb1-fc080fe469a9 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:31.890674255 +0000 UTC m=+112.648151660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z86d" (UID: "daaeed21-ac53-4784-abb1-fc080fe469a9") : secret "samples-operator-tls" not found Apr 17 08:04:31.249103 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:31.249052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" event={"ID":"ba78d13c-1ebd-4761-b661-3cc6591106b7","Type":"ContainerStarted","Data":"dbf291552644547335c59619c11c81bff31819db503ffdc24fb8846443d4c681"} Apr 17 08:04:31.250152 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:31.250122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" event={"ID":"c9d36857-0992-41ad-aa34-4e41c08ace48","Type":"ContainerStarted","Data":"2989f0da971cf059f2e1a536ac138dc1bf68e1d8561cb05166af434e22a12e7a"} Apr 17 08:04:31.898711 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:31.898674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:31.898907 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:31.898839 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 08:04:31.899009 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:31.898910 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls podName:daaeed21-ac53-4784-abb1-fc080fe469a9 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:33.898888849 +0000 UTC m=+114.656366268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z86d" (UID: "daaeed21-ac53-4784-abb1-fc080fe469a9") : secret "samples-operator-tls" not found Apr 17 08:04:32.253066 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:32.253031 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" event={"ID":"ba78d13c-1ebd-4761-b661-3cc6591106b7","Type":"ContainerStarted","Data":"2c4b48f04446adc5da4f569961e7011bd4c30e62499991b1d6867dff2b1f4814"} Apr 17 08:04:32.267165 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:32.267120 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vwc9p" podStartSLOduration=0.906529427 podStartE2EDuration="2.267105814s" podCreationTimestamp="2026-04-17 08:04:30 +0000 UTC" firstStartedPulling="2026-04-17 08:04:30.75521205 +0000 UTC m=+111.512689457" 
lastFinishedPulling="2026-04-17 08:04:32.115788424 +0000 UTC m=+112.873265844" observedRunningTime="2026-04-17 08:04:32.266467858 +0000 UTC m=+113.023945298" watchObservedRunningTime="2026-04-17 08:04:32.267105814 +0000 UTC m=+113.024583267" Apr 17 08:04:33.913904 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:33.913822 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:33.914261 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:33.913980 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 08:04:33.914261 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:33.914040 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls podName:daaeed21-ac53-4784-abb1-fc080fe469a9 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:37.914023198 +0000 UTC m=+118.671500626 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z86d" (UID: "daaeed21-ac53-4784-abb1-fc080fe469a9") : secret "samples-operator-tls" not found Apr 17 08:04:34.261021 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.260976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" event={"ID":"c9d36857-0992-41ad-aa34-4e41c08ace48","Type":"ContainerStarted","Data":"667190fc7c0578db12977b7d2b4fc820931437af8ab5d42a8dbd1965ec5b7888"} Apr 17 08:04:34.276475 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.276430 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" podStartSLOduration=1.389945909 podStartE2EDuration="4.27641473s" podCreationTimestamp="2026-04-17 08:04:30 +0000 UTC" firstStartedPulling="2026-04-17 08:04:30.737425157 +0000 UTC m=+111.494902563" lastFinishedPulling="2026-04-17 08:04:33.623893975 +0000 UTC m=+114.381371384" observedRunningTime="2026-04-17 08:04:34.275339274 +0000 UTC m=+115.032816708" watchObservedRunningTime="2026-04-17 08:04:34.27641473 +0000 UTC m=+115.033892158" Apr 17 08:04:34.771321 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.771285 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m"] Apr 17 08:04:34.773898 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.773882 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" Apr 17 08:04:34.776607 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.776583 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 08:04:34.776607 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.776596 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 08:04:34.776764 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.776584 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-g4pgg\"" Apr 17 08:04:34.788220 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.788197 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m"] Apr 17 08:04:34.922912 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:34.922877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srqrp\" (UniqueName: \"kubernetes.io/projected/48f0adf1-9c65-4866-b146-76db151d34d3-kube-api-access-srqrp\") pod \"migrator-74bb7799d9-55t9m\" (UID: \"48f0adf1-9c65-4866-b146-76db151d34d3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" Apr 17 08:04:35.023525 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:35.023442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srqrp\" (UniqueName: \"kubernetes.io/projected/48f0adf1-9c65-4866-b146-76db151d34d3-kube-api-access-srqrp\") pod \"migrator-74bb7799d9-55t9m\" (UID: \"48f0adf1-9c65-4866-b146-76db151d34d3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" Apr 17 08:04:35.031434 ip-10-0-138-63 kubenswrapper[2570]: 
I0417 08:04:35.031406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srqrp\" (UniqueName: \"kubernetes.io/projected/48f0adf1-9c65-4866-b146-76db151d34d3-kube-api-access-srqrp\") pod \"migrator-74bb7799d9-55t9m\" (UID: \"48f0adf1-9c65-4866-b146-76db151d34d3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" Apr 17 08:04:35.083437 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:35.083408 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" Apr 17 08:04:35.196663 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:35.196601 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m"] Apr 17 08:04:35.199587 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:35.199552 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f0adf1_9c65_4866_b146_76db151d34d3.slice/crio-8bc97b79da4fede49f8718f2aa470d02a735fd25198c9c0e3e6f2162c28eb7f3 WatchSource:0}: Error finding container 8bc97b79da4fede49f8718f2aa470d02a735fd25198c9c0e3e6f2162c28eb7f3: Status 404 returned error can't find the container with id 8bc97b79da4fede49f8718f2aa470d02a735fd25198c9c0e3e6f2162c28eb7f3 Apr 17 08:04:35.264610 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:35.264580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" event={"ID":"48f0adf1-9c65-4866-b146-76db151d34d3","Type":"ContainerStarted","Data":"8bc97b79da4fede49f8718f2aa470d02a735fd25198c9c0e3e6f2162c28eb7f3"} Apr 17 08:04:36.030971 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:36.030912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" Apr 17 08:04:36.031455 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:36.031090 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 08:04:36.031455 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:36.031184 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls podName:6c639c32-2f50-4f3f-9ba1-c99215cb7e01 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:52.031161231 +0000 UTC m=+132.788638642 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-f85s5" (UID: "6c639c32-2f50-4f3f-9ba1-c99215cb7e01") : secret "cluster-monitoring-operator-tls" not found Apr 17 08:04:37.272035 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:37.271998 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" event={"ID":"48f0adf1-9c65-4866-b146-76db151d34d3","Type":"ContainerStarted","Data":"f3731ac7fd0ec963b11f5f14f4074fcdc20770d9e047e6c3eac75b0177efee22"} Apr 17 08:04:37.272035 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:37.272036 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" event={"ID":"48f0adf1-9c65-4866-b146-76db151d34d3","Type":"ContainerStarted","Data":"421c63375d5974e19bb5d2019a4a8b2dc210f4f5e6d5bfeeb14c868d5c5aaf33"} Apr 17 08:04:37.290104 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:04:37.290060 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-55t9m" podStartSLOduration=1.7466689899999999 podStartE2EDuration="3.290047483s" podCreationTimestamp="2026-04-17 08:04:34 +0000 UTC" firstStartedPulling="2026-04-17 08:04:35.201436517 +0000 UTC m=+115.958913926" lastFinishedPulling="2026-04-17 08:04:36.744815005 +0000 UTC m=+117.502292419" observedRunningTime="2026-04-17 08:04:37.289224321 +0000 UTC m=+118.046701750" watchObservedRunningTime="2026-04-17 08:04:37.290047483 +0000 UTC m=+118.047524910" Apr 17 08:04:37.948764 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:37.948724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:37.948911 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:37.948869 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 08:04:37.948967 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:37.948928 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls podName:daaeed21-ac53-4784-abb1-fc080fe469a9 nodeName:}" failed. No retries permitted until 2026-04-17 08:04:45.948913073 +0000 UTC m=+126.706390480 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2z86d" (UID: "daaeed21-ac53-4784-abb1-fc080fe469a9") : secret "samples-operator-tls" not found Apr 17 08:04:46.012653 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:46.012604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:46.014989 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:46.014970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/daaeed21-ac53-4784-abb1-fc080fe469a9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2z86d\" (UID: \"daaeed21-ac53-4784-abb1-fc080fe469a9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:46.118042 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:46.118012 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-9dxhj\"" Apr 17 08:04:46.125404 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:46.125385 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" Apr 17 08:04:46.236132 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:46.236106 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d"] Apr 17 08:04:46.297903 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:46.297875 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" event={"ID":"daaeed21-ac53-4784-abb1-fc080fe469a9","Type":"ContainerStarted","Data":"d12c2ffd747326323338e53c2eb0805ff69a34a0a00fd0177279e062de3bb537"} Apr 17 08:04:48.632934 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:48.632833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:04:48.633352 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:48.633024 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 08:04:48.633352 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:04:48.633128 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs podName:825bc295-b53d-4e6b-9c7e-ad30d2d38c65 nodeName:}" failed. No retries permitted until 2026-04-17 08:06:50.63310492 +0000 UTC m=+251.390582327 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs") pod "network-metrics-daemon-x6js9" (UID: "825bc295-b53d-4e6b-9c7e-ad30d2d38c65") : secret "metrics-daemon-secret" not found Apr 17 08:04:49.307550 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:49.307514 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" event={"ID":"daaeed21-ac53-4784-abb1-fc080fe469a9","Type":"ContainerStarted","Data":"e5306697689474d579f013b410cbd61da4076232121df6f18db1fe09005976b6"} Apr 17 08:04:49.307550 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:49.307555 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" event={"ID":"daaeed21-ac53-4784-abb1-fc080fe469a9","Type":"ContainerStarted","Data":"4dd73b2fab3ed743eadc2bd640b4874014905f92ccf1a417733f51bdd2e839c6"} Apr 17 08:04:49.324089 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:49.324041 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2z86d" podStartSLOduration=17.246926783 podStartE2EDuration="19.324027871s" podCreationTimestamp="2026-04-17 08:04:30 +0000 UTC" firstStartedPulling="2026-04-17 08:04:46.278531779 +0000 UTC m=+127.036009185" lastFinishedPulling="2026-04-17 08:04:48.355632863 +0000 UTC m=+129.113110273" observedRunningTime="2026-04-17 08:04:49.323371588 +0000 UTC m=+130.080849016" watchObservedRunningTime="2026-04-17 08:04:49.324027871 +0000 UTC m=+130.081505299" Apr 17 08:04:52.063473 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:52.063436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") 
pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" Apr 17 08:04:52.065847 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:52.065825 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c639c32-2f50-4f3f-9ba1-c99215cb7e01-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-f85s5\" (UID: \"6c639c32-2f50-4f3f-9ba1-c99215cb7e01\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" Apr 17 08:04:52.333775 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:52.333685 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-r9xfp\"" Apr 17 08:04:52.341982 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:52.341957 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" Apr 17 08:04:52.456307 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:52.456275 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5"] Apr 17 08:04:52.460038 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:52.460005 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c639c32_2f50_4f3f_9ba1_c99215cb7e01.slice/crio-46a82ebd8d3e52cf6c91d410986cce5e77788ffa343d8c452880d7475b387617 WatchSource:0}: Error finding container 46a82ebd8d3e52cf6c91d410986cce5e77788ffa343d8c452880d7475b387617: Status 404 returned error can't find the container with id 46a82ebd8d3e52cf6c91d410986cce5e77788ffa343d8c452880d7475b387617 Apr 17 08:04:53.317863 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:53.317827 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" event={"ID":"6c639c32-2f50-4f3f-9ba1-c99215cb7e01","Type":"ContainerStarted","Data":"46a82ebd8d3e52cf6c91d410986cce5e77788ffa343d8c452880d7475b387617"} Apr 17 08:04:55.323498 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:55.323465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" event={"ID":"6c639c32-2f50-4f3f-9ba1-c99215cb7e01","Type":"ContainerStarted","Data":"3c7ba6cb91505b39dc5fe3a32acfd448f216977d1eaae696731cb1096e1ee56f"} Apr 17 08:04:55.342355 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:55.342307 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-f85s5" podStartSLOduration=33.402884178 podStartE2EDuration="35.342292773s" podCreationTimestamp="2026-04-17 08:04:20 +0000 UTC" firstStartedPulling="2026-04-17 08:04:52.461835941 +0000 UTC 
m=+133.219313347" lastFinishedPulling="2026-04-17 08:04:54.401244536 +0000 UTC m=+135.158721942" observedRunningTime="2026-04-17 08:04:55.34162574 +0000 UTC m=+136.099103168" watchObservedRunningTime="2026-04-17 08:04:55.342292773 +0000 UTC m=+136.099770206" Apr 17 08:04:58.595856 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.595821 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j"] Apr 17 08:04:58.597969 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.597934 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.600808 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.600786 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 08:04:58.602155 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.602134 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 08:04:58.602155 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.602142 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-sm8wk\"" Apr 17 08:04:58.608683 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.608663 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j"] Apr 17 08:04:58.614953 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.614920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fee27ff2-c1bc-4b4a-ab6a-a22844376a8f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zjl9j\" (UID: \"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.615068 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.615043 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fee27ff2-c1bc-4b4a-ab6a-a22844376a8f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zjl9j\" (UID: \"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.710091 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.710056 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6"] Apr 17 08:04:58.712423 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.712403 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" Apr 17 08:04:58.715564 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.715539 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fee27ff2-c1bc-4b4a-ab6a-a22844376a8f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zjl9j\" (UID: \"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.715659 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.715607 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/aa7fa9f7-d17e-4534-91e2-b7f035da1e65-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rn7l6\" (UID: \"aa7fa9f7-d17e-4534-91e2-b7f035da1e65\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" Apr 17 08:04:58.715708 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:04:58.715670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fee27ff2-c1bc-4b4a-ab6a-a22844376a8f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zjl9j\" (UID: \"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.715755 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.715720 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-k6zt6\"" Apr 17 08:04:58.715902 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.715881 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 08:04:58.716259 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.716236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fee27ff2-c1bc-4b4a-ab6a-a22844376a8f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zjl9j\" (UID: \"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.718050 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.718027 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fee27ff2-c1bc-4b4a-ab6a-a22844376a8f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zjl9j\" (UID: \"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.731545 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.731520 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6"] Apr 17 08:04:58.735447 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.735427 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4pltb"] Apr 17 08:04:58.737857 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.737843 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.741114 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.741095 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5w4p6\"" Apr 17 08:04:58.741684 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.741666 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 08:04:58.741763 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.741686 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 08:04:58.750465 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.750443 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4pltb"] Apr 17 08:04:58.816259 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.816231 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tdw\" (UniqueName: \"kubernetes.io/projected/febe2ce9-02a6-467c-836e-72a352ffead8-kube-api-access-r6tdw\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.816396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.816272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/febe2ce9-02a6-467c-836e-72a352ffead8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.816396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.816296 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/febe2ce9-02a6-467c-836e-72a352ffead8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.816396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.816348 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/febe2ce9-02a6-467c-836e-72a352ffead8-crio-socket\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.816552 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.816438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/aa7fa9f7-d17e-4534-91e2-b7f035da1e65-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rn7l6\" (UID: \"aa7fa9f7-d17e-4534-91e2-b7f035da1e65\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" Apr 17 08:04:58.816552 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.816461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/febe2ce9-02a6-467c-836e-72a352ffead8-data-volume\") pod \"insights-runtime-extractor-4pltb\" (UID: 
\"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.819408 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.819380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/aa7fa9f7-d17e-4534-91e2-b7f035da1e65-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rn7l6\" (UID: \"aa7fa9f7-d17e-4534-91e2-b7f035da1e65\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" Apr 17 08:04:58.906382 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.906313 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" Apr 17 08:04:58.917401 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/febe2ce9-02a6-467c-836e-72a352ffead8-data-volume\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.917520 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917419 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tdw\" (UniqueName: \"kubernetes.io/projected/febe2ce9-02a6-467c-836e-72a352ffead8-kube-api-access-r6tdw\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.917520 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917448 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/febe2ce9-02a6-467c-836e-72a352ffead8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4pltb\" (UID: 
\"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.917520 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/febe2ce9-02a6-467c-836e-72a352ffead8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.917520 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917516 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/febe2ce9-02a6-467c-836e-72a352ffead8-crio-socket\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.917724 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/febe2ce9-02a6-467c-836e-72a352ffead8-crio-socket\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.917780 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.917743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/febe2ce9-02a6-467c-836e-72a352ffead8-data-volume\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.918168 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.918143 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/febe2ce9-02a6-467c-836e-72a352ffead8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.919830 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.919809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/febe2ce9-02a6-467c-836e-72a352ffead8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:58.925917 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:58.925894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tdw\" (UniqueName: \"kubernetes.io/projected/febe2ce9-02a6-467c-836e-72a352ffead8-kube-api-access-r6tdw\") pod \"insights-runtime-extractor-4pltb\" (UID: \"febe2ce9-02a6-467c-836e-72a352ffead8\") " pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:59.026692 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.026666 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" Apr 17 08:04:59.042675 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.042650 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j"] Apr 17 08:04:59.044916 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:59.044880 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee27ff2_c1bc_4b4a_ab6a_a22844376a8f.slice/crio-2776ef5bfa2ec18e472b55239a59ca4c300051dca121ed6ff168f03982d103b8 WatchSource:0}: Error finding container 2776ef5bfa2ec18e472b55239a59ca4c300051dca121ed6ff168f03982d103b8: Status 404 returned error can't find the container with id 2776ef5bfa2ec18e472b55239a59ca4c300051dca121ed6ff168f03982d103b8 Apr 17 08:04:59.045846 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.045824 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4pltb" Apr 17 08:04:59.150767 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.150737 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6"] Apr 17 08:04:59.153380 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:59.153351 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa7fa9f7_d17e_4534_91e2_b7f035da1e65.slice/crio-33293f25b3047b76b2d12b37c719f39637dc50b38562704f41d03cab2b1c779f WatchSource:0}: Error finding container 33293f25b3047b76b2d12b37c719f39637dc50b38562704f41d03cab2b1c779f: Status 404 returned error can't find the container with id 33293f25b3047b76b2d12b37c719f39637dc50b38562704f41d03cab2b1c779f Apr 17 08:04:59.166209 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.166185 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4pltb"] Apr 17 08:04:59.168752 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:04:59.168728 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfebe2ce9_02a6_467c_836e_72a352ffead8.slice/crio-136b63debbd491a7107e59d0af81d9ae57c82e5a1aa7d7fd949695416663847b WatchSource:0}: Error finding container 136b63debbd491a7107e59d0af81d9ae57c82e5a1aa7d7fd949695416663847b: Status 404 returned error can't find the container with id 136b63debbd491a7107e59d0af81d9ae57c82e5a1aa7d7fd949695416663847b Apr 17 08:04:59.335659 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.335624 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4pltb" event={"ID":"febe2ce9-02a6-467c-836e-72a352ffead8","Type":"ContainerStarted","Data":"89baf73141053de4dd00f330ea5ce4e0b149e14c0e86e37e6a377529fe09c789"} Apr 17 08:04:59.335800 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:04:59.335665 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4pltb" event={"ID":"febe2ce9-02a6-467c-836e-72a352ffead8","Type":"ContainerStarted","Data":"136b63debbd491a7107e59d0af81d9ae57c82e5a1aa7d7fd949695416663847b"} Apr 17 08:04:59.336731 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.336706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" event={"ID":"aa7fa9f7-d17e-4534-91e2-b7f035da1e65","Type":"ContainerStarted","Data":"33293f25b3047b76b2d12b37c719f39637dc50b38562704f41d03cab2b1c779f"} Apr 17 08:04:59.337654 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:04:59.337636 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" event={"ID":"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f","Type":"ContainerStarted","Data":"2776ef5bfa2ec18e472b55239a59ca4c300051dca121ed6ff168f03982d103b8"} Apr 17 08:05:00.341397 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.341359 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" event={"ID":"fee27ff2-c1bc-4b4a-ab6a-a22844376a8f","Type":"ContainerStarted","Data":"1144bb51ba136a258153b1f2dda0c523086e00fabee470b23eb90eedebafc38d"} Apr 17 08:05:00.342990 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.342966 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4pltb" event={"ID":"febe2ce9-02a6-467c-836e-72a352ffead8","Type":"ContainerStarted","Data":"309a8ef7e365e3f14da33afd3e69d6b77b1b0b3d915e0ac39e7eb26e3456a4ed"} Apr 17 08:05:00.344281 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.344248 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" 
event={"ID":"aa7fa9f7-d17e-4534-91e2-b7f035da1e65","Type":"ContainerStarted","Data":"5839394d7dbda306f17b941469da71ac1743602678d00aa22a669acd3c42a14f"}
Apr 17 08:05:00.344510 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.344490 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6"
Apr 17 08:05:00.345736 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.345712 2570 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-57cf98b594-rn7l6 container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.133.0.16:8443/healthz\": dial tcp 10.133.0.16:8443: connect: connection refused" start-of-body=
Apr 17 08:05:00.345841 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.345748 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" podUID="aa7fa9f7-d17e-4534-91e2-b7f035da1e65" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.133.0.16:8443/healthz\": dial tcp 10.133.0.16:8443: connect: connection refused"
Apr 17 08:05:00.379909 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.379859 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6" podStartSLOduration=1.2460934080000001 podStartE2EDuration="2.379842412s" podCreationTimestamp="2026-04-17 08:04:58 +0000 UTC" firstStartedPulling="2026-04-17 08:04:59.155216124 +0000 UTC m=+139.912693535" lastFinishedPulling="2026-04-17 08:05:00.288965121 +0000 UTC m=+141.046442539" observedRunningTime="2026-04-17 08:05:00.379008112 +0000 UTC m=+141.136485541" watchObservedRunningTime="2026-04-17 08:05:00.379842412 +0000 UTC m=+141.137319841"
Apr 17 08:05:00.380078 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:00.380051 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zjl9j" podStartSLOduration=1.169597334 podStartE2EDuration="2.38004652s" podCreationTimestamp="2026-04-17 08:04:58 +0000 UTC" firstStartedPulling="2026-04-17 08:04:59.046878698 +0000 UTC m=+139.804356104" lastFinishedPulling="2026-04-17 08:05:00.257327884 +0000 UTC m=+141.014805290" observedRunningTime="2026-04-17 08:05:00.361723031 +0000 UTC m=+141.119200473" watchObservedRunningTime="2026-04-17 08:05:00.38004652 +0000 UTC m=+141.137523946"
Apr 17 08:05:01.353578 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:01.353548 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rn7l6"
Apr 17 08:05:02.350868 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:02.350828 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4pltb" event={"ID":"febe2ce9-02a6-467c-836e-72a352ffead8","Type":"ContainerStarted","Data":"bff7893c673ea721ffa8cb1cc212b01dc9786536427413d38e9daa699e67bfce"}
Apr 17 08:05:02.369615 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:02.369563 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4pltb" podStartSLOduration=2.075421186 podStartE2EDuration="4.369550626s" podCreationTimestamp="2026-04-17 08:04:58 +0000 UTC" firstStartedPulling="2026-04-17 08:04:59.226030112 +0000 UTC m=+139.983507535" lastFinishedPulling="2026-04-17 08:05:01.520159568 +0000 UTC m=+142.277636975" observedRunningTime="2026-04-17 08:05:02.367661074 +0000 UTC m=+143.125138502" watchObservedRunningTime="2026-04-17 08:05:02.369550626 +0000 UTC m=+143.127028054"
Apr 17 08:05:07.258748 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.258713 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"]
Apr 17 08:05:07.262039 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.262019 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.264728 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.264708 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 08:05:07.264845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.264731 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 08:05:07.264845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.264779 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 08:05:07.266163 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.266146 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2f8bw\""
Apr 17 08:05:07.273821 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.273800 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"]
Apr 17 08:05:07.278223 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.278205 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-889nc"]
Apr 17 08:05:07.280644 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.280628 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.283380 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.283359 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 08:05:07.283477 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.283378 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ddzmq\""
Apr 17 08:05:07.283477 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.283421 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 08:05:07.283567 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.283493 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 08:05:07.285875 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.285858 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zgs9j"]
Apr 17 08:05:07.287662 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.287642 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.287749 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.287686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgw9\" (UniqueName: \"kubernetes.io/projected/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-kube-api-access-9qgw9\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.287790 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.287753 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.287824 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.287816 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.288355 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.288341 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.291227 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.291204 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 08:05:07.291322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.291244 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hwh7w\""
Apr 17 08:05:07.291414 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.291400 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 08:05:07.292019 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.292001 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 08:05:07.292185 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.292172 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-889nc"]
Apr 17 08:05:07.388743 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.388743 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388746 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbz4c\" (UniqueName: \"kubernetes.io/projected/9ca23c5d-aff9-4250-952e-3fe91b19a469-kube-api-access-tbz4c\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.388976 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76d99ffc-5745-4855-aad4-1f77be2a9f22-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.388976 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-wtmp\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.388976 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388843 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-sys\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.388976 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76d99ffc-5745-4855-aad4-1f77be2a9f22-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.388976 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:07.388870 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 17 08:05:07.388976 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:07.388933 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-tls podName:c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1 nodeName:}" failed. No retries permitted until 2026-04-17 08:05:07.888915032 +0000 UTC m=+148.646392442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-gmhvh" (UID: "c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1") : secret "openshift-state-metrics-tls" not found
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.388979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-root\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389029 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-textfile\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389053 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ca23c5d-aff9-4250-952e-3fe91b19a469-metrics-client-ca\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389162 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlfd2\" (UniqueName: \"kubernetes.io/projected/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-api-access-vlfd2\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389190 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.389208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-tls\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.389571 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgw9\" (UniqueName: \"kubernetes.io/projected/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-kube-api-access-9qgw9\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.389571 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.389571 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389281 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-accelerators-collector-config\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.389571 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389316 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.389698 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.389607 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.391591 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.391570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.399322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.399300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgw9\" (UniqueName: \"kubernetes.io/projected/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-kube-api-access-9qgw9\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.490030 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbz4c\" (UniqueName: \"kubernetes.io/projected/9ca23c5d-aff9-4250-952e-3fe91b19a469-kube-api-access-tbz4c\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490154 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76d99ffc-5745-4855-aad4-1f77be2a9f22-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490214 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-wtmp\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490261 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-sys\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490261 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490249 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76d99ffc-5745-4855-aad4-1f77be2a9f22-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490347 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490347 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490312 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-sys\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490347 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-wtmp\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490313 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-root\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9ca23c5d-aff9-4250-952e-3fe91b19a469-root\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490378 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-textfile\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490406 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ca23c5d-aff9-4250-952e-3fe91b19a469-metrics-client-ca\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlfd2\" (UniqueName: \"kubernetes.io/projected/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-api-access-vlfd2\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490517 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-tls\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490555 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-accelerators-collector-config\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490652 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/76d99ffc-5745-4855-aad4-1f77be2a9f22-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76d99ffc-5745-4855-aad4-1f77be2a9f22-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.490810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.490762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-textfile\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.491180 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.491094 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ca23c5d-aff9-4250-952e-3fe91b19a469-metrics-client-ca\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.491240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.491183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-accelerators-collector-config\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.491398 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.491377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.493035 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.493010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-tls\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.493123 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.493032 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ca23c5d-aff9-4250-952e-3fe91b19a469-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.493332 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.493309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.493369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.493316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.497569 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.497547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbz4c\" (UniqueName: \"kubernetes.io/projected/9ca23c5d-aff9-4250-952e-3fe91b19a469-kube-api-access-tbz4c\") pod \"node-exporter-zgs9j\" (UID: \"9ca23c5d-aff9-4250-952e-3fe91b19a469\") " pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.497817 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.497798 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlfd2\" (UniqueName: \"kubernetes.io/projected/76d99ffc-5745-4855-aad4-1f77be2a9f22-kube-api-access-vlfd2\") pod \"kube-state-metrics-69db897b98-889nc\" (UID: \"76d99ffc-5745-4855-aad4-1f77be2a9f22\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.589799 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.589724 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc"
Apr 17 08:05:07.597709 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.597633 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zgs9j"
Apr 17 08:05:07.606354 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:07.606327 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca23c5d_aff9_4250_952e_3fe91b19a469.slice/crio-6ea1054bb23c23cfcd45699b5e443494f2c88b25549fa7b01671ea0d578eea0c WatchSource:0}: Error finding container 6ea1054bb23c23cfcd45699b5e443494f2c88b25549fa7b01671ea0d578eea0c: Status 404 returned error can't find the container with id 6ea1054bb23c23cfcd45699b5e443494f2c88b25549fa7b01671ea0d578eea0c
Apr 17 08:05:07.712645 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.712612 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-889nc"]
Apr 17 08:05:07.715412 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:07.715384 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d99ffc_5745_4855_aad4_1f77be2a9f22.slice/crio-2851e6c7420020308dec025e89de4b91eec14d8b3c9b93e93a55c42aa5f3c98f WatchSource:0}: Error finding container 2851e6c7420020308dec025e89de4b91eec14d8b3c9b93e93a55c42aa5f3c98f: Status 404 returned error can't find the container with id 2851e6c7420020308dec025e89de4b91eec14d8b3c9b93e93a55c42aa5f3c98f
Apr 17 08:05:07.894722 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.894642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:07.897595 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:07.897576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gmhvh\" (UID: \"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:08.170723 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.170636 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"
Apr 17 08:05:08.316839 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.316799 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh"]
Apr 17 08:05:08.319701 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:08.319665 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc278dd7a_99b8_425a_b7a7_f5cdcd2e0ce1.slice/crio-7c88154d2431396633a4c53025e1d858473e605baeb9a455e8ad8899f8b75f96 WatchSource:0}: Error finding container 7c88154d2431396633a4c53025e1d858473e605baeb9a455e8ad8899f8b75f96: Status 404 returned error can't find the container with id 7c88154d2431396633a4c53025e1d858473e605baeb9a455e8ad8899f8b75f96
Apr 17 08:05:08.367027 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.366990 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgs9j" event={"ID":"9ca23c5d-aff9-4250-952e-3fe91b19a469","Type":"ContainerStarted","Data":"6ea1054bb23c23cfcd45699b5e443494f2c88b25549fa7b01671ea0d578eea0c"}
Apr 17 08:05:08.368847 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.368807 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc" event={"ID":"76d99ffc-5745-4855-aad4-1f77be2a9f22","Type":"ContainerStarted","Data":"2851e6c7420020308dec025e89de4b91eec14d8b3c9b93e93a55c42aa5f3c98f"}
Apr 17 08:05:08.370105 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.370022 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh" event={"ID":"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1","Type":"ContainerStarted","Data":"7c88154d2431396633a4c53025e1d858473e605baeb9a455e8ad8899f8b75f96"}
Apr 17 08:05:08.379835 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.379811 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 08:05:08.388212 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.388187 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 08:05:08.391046 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.391022 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 08:05:08.391403 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.391387 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 08:05:08.391596 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.391581 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 08:05:08.391865 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.391850 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8gqgb\""
Apr 17 08:05:08.391917 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.391867 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 08:05:08.392104 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.392088 2570 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 08:05:08.392169 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.392120 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 08:05:08.392303 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.392287 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 08:05:08.392359 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.392314 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 08:05:08.392555 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.392539 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 08:05:08.399326 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.399290 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:05:08.500910 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.500675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.500910 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.500757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-config-volume\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.500910 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.500802 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.500910 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.500859 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-config-out\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.500909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.500996 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7tp6\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-kube-api-access-j7tp6\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501025 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501053 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.501254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.501217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-web-config\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602251 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602251 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-config-volume\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602251 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy\") 
pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602260 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-config-out\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602317 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7tp6\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-kube-api-access-j7tp6\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602347 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602372 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:08.602377 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602479 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602430 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602847 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:08.602494 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls podName:5ccb7968-cc78-410b-b649-4ac84c22e844 nodeName:}" failed. No retries permitted until 2026-04-17 08:05:09.102428328 +0000 UTC m=+149.859905734 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844") : secret "alertmanager-main-tls" not found Apr 17 08:05:08.602847 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602548 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602847 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.602847 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.602630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-web-config\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.604992 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.603832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.604992 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.604768 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.605192 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.605167 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.606995 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.606969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-config-volume\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.607651 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.607566 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.607736 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.607646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-web-config\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.607736 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:05:08.607689 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.608339 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.608297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.608520 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.608501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-config-out\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.609100 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.609075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.609181 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.609091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:08.613410 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:08.613388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7tp6\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-kube-api-access-j7tp6\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:09.107183 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.107148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:09.109502 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.109471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:09.321307 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.321271 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:05:09.375000 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.374889 2570 generic.go:358] "Generic (PLEG): container finished" podID="9ca23c5d-aff9-4250-952e-3fe91b19a469" containerID="6f004dd7032d2908b16ced1be93f641df98edf02d783ad2ed08dcc4efeff5e53" exitCode=0 Apr 17 08:05:09.375139 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.375016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgs9j" event={"ID":"9ca23c5d-aff9-4250-952e-3fe91b19a469","Type":"ContainerDied","Data":"6f004dd7032d2908b16ced1be93f641df98edf02d783ad2ed08dcc4efeff5e53"} Apr 17 08:05:09.377464 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.377438 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc" event={"ID":"76d99ffc-5745-4855-aad4-1f77be2a9f22","Type":"ContainerStarted","Data":"2a1ed5b225a2e1c239225a713a28988f75a3a2dd336ba7055b5c1cc2b1df9de6"} Apr 17 08:05:09.377562 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.377471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc" event={"ID":"76d99ffc-5745-4855-aad4-1f77be2a9f22","Type":"ContainerStarted","Data":"e4454ad7cbf6ba00360994dcb9c6236bcb4f2a18c50f2641a65ced7620daf7ff"} Apr 17 08:05:09.377562 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.377486 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc" event={"ID":"76d99ffc-5745-4855-aad4-1f77be2a9f22","Type":"ContainerStarted","Data":"31b9bc8fe98b8f873a6c431b594df9b81bf284c2720fe9b26d8bf62c343fc30e"} Apr 17 08:05:09.382547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.382298 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh" 
event={"ID":"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1","Type":"ContainerStarted","Data":"d3d60f551a556970b2d54f5a39e6e7f29d38475cbab90d770f59292c133f218a"} Apr 17 08:05:09.382547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.382328 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh" event={"ID":"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1","Type":"ContainerStarted","Data":"f4d313ba3c19773c85ccfce638287428b3b0cf5298388719b31597c09c5a29b1"} Apr 17 08:05:09.424604 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.424547 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-889nc" podStartSLOduration=1.046590569 podStartE2EDuration="2.42452727s" podCreationTimestamp="2026-04-17 08:05:07 +0000 UTC" firstStartedPulling="2026-04-17 08:05:07.717395403 +0000 UTC m=+148.474872809" lastFinishedPulling="2026-04-17 08:05:09.095332098 +0000 UTC m=+149.852809510" observedRunningTime="2026-04-17 08:05:09.422363278 +0000 UTC m=+150.179840707" watchObservedRunningTime="2026-04-17 08:05:09.42452727 +0000 UTC m=+150.182004700" Apr 17 08:05:09.479406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:09.479379 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:05:10.387739 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:10.387695 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgs9j" event={"ID":"9ca23c5d-aff9-4250-952e-3fe91b19a469","Type":"ContainerStarted","Data":"a7bbaa3894aa8216c198a0c30c585cef804a16bae6f3b01dda1ac99e983507a8"} Apr 17 08:05:10.387739 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:10.387741 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zgs9j" 
event={"ID":"9ca23c5d-aff9-4250-952e-3fe91b19a469","Type":"ContainerStarted","Data":"a85a9cc41ce8d12c898b956fc9ff7c304e547711266847a70792188bc95b9052"} Apr 17 08:05:10.389012 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:10.388973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"8e2d4cc3cf5ab5f7305bb152a09d21f63cf2801fa4d1f10bfff4eb99032ed297"} Apr 17 08:05:10.391254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:10.391214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh" event={"ID":"c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1","Type":"ContainerStarted","Data":"8f59437083431e872bbe7eb740651f831d29f2b466cbb5f83ff06bb25df61a29"} Apr 17 08:05:10.409104 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:10.409057 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zgs9j" podStartSLOduration=2.4457532840000002 podStartE2EDuration="3.409043508s" podCreationTimestamp="2026-04-17 08:05:07 +0000 UTC" firstStartedPulling="2026-04-17 08:05:07.608409147 +0000 UTC m=+148.365886555" lastFinishedPulling="2026-04-17 08:05:08.571699368 +0000 UTC m=+149.329176779" observedRunningTime="2026-04-17 08:05:10.408101153 +0000 UTC m=+151.165578582" watchObservedRunningTime="2026-04-17 08:05:10.409043508 +0000 UTC m=+151.166520961" Apr 17 08:05:10.429482 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:10.429421 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gmhvh" podStartSLOduration=2.3091641640000002 podStartE2EDuration="3.429403242s" podCreationTimestamp="2026-04-17 08:05:07 +0000 UTC" firstStartedPulling="2026-04-17 08:05:08.671329617 +0000 UTC m=+149.428807031" lastFinishedPulling="2026-04-17 08:05:09.791568704 +0000 UTC m=+150.549046109" 
observedRunningTime="2026-04-17 08:05:10.427225227 +0000 UTC m=+151.184702677" watchObservedRunningTime="2026-04-17 08:05:10.429403242 +0000 UTC m=+151.186880671" Apr 17 08:05:11.395335 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.395299 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937" exitCode=0 Apr 17 08:05:11.395718 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.395384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937"} Apr 17 08:05:11.676168 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.676090 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-55b96554df-2z2jz"] Apr 17 08:05:11.679188 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.679173 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.682260 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.682227 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 08:05:11.682396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.682225 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 08:05:11.682396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.682225 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 08:05:11.682396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.682230 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 08:05:11.682396 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.682230 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1u260cl6cenkc\""
Apr 17 08:05:11.683506 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.683488 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m5zdn\""
Apr 17 08:05:11.687755 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.687732 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55b96554df-2z2jz"]
Apr 17 08:05:11.737026 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.736994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b52aae27-b89c-4591-ae6b-d324992aef0c-metrics-server-audit-profiles\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.737179 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.737028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-secret-metrics-server-tls\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.737179 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.737165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b52aae27-b89c-4591-ae6b-d324992aef0c-audit-log\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.737271 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.737199 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f4dp\" (UniqueName: \"kubernetes.io/projected/b52aae27-b89c-4591-ae6b-d324992aef0c-kube-api-access-2f4dp\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.737271 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.737226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-secret-metrics-server-client-certs\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.737271 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.737244 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b52aae27-b89c-4591-ae6b-d324992aef0c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.737366 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.737306 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-client-ca-bundle\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.837899 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.837853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f4dp\" (UniqueName: \"kubernetes.io/projected/b52aae27-b89c-4591-ae6b-d324992aef0c-kube-api-access-2f4dp\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838100 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.837910 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-secret-metrics-server-client-certs\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838100 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.837934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b52aae27-b89c-4591-ae6b-d324992aef0c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838100 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.837988 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-client-ca-bundle\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838100 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.838033 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b52aae27-b89c-4591-ae6b-d324992aef0c-metrics-server-audit-profiles\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838100 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.838060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-secret-metrics-server-tls\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838368 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.838183 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b52aae27-b89c-4591-ae6b-d324992aef0c-audit-log\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838603 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.838570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/b52aae27-b89c-4591-ae6b-d324992aef0c-audit-log\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.838846 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.838813 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b52aae27-b89c-4591-ae6b-d324992aef0c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.839864 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.839839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/b52aae27-b89c-4591-ae6b-d324992aef0c-metrics-server-audit-profiles\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.840637 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.840616 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-secret-metrics-server-tls\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.840710 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.840680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-client-ca-bundle\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.840755 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.840706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/b52aae27-b89c-4591-ae6b-d324992aef0c-secret-metrics-server-client-certs\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.850017 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.849993 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f4dp\" (UniqueName: \"kubernetes.io/projected/b52aae27-b89c-4591-ae6b-d324992aef0c-kube-api-access-2f4dp\") pod \"metrics-server-55b96554df-2z2jz\" (UID: \"b52aae27-b89c-4591-ae6b-d324992aef0c\") " pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:11.989452 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:11.989374 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:12.136065 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.135934 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55b96554df-2z2jz"]
Apr 17 08:05:12.139083 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:12.139049 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb52aae27_b89c_4591_ae6b_d324992aef0c.slice/crio-edf469b90756013ab90f7bc5fe9381bba065832fe898151a844d83ed28bb5538 WatchSource:0}: Error finding container edf469b90756013ab90f7bc5fe9381bba065832fe898151a844d83ed28bb5538: Status 404 returned error can't find the container with id edf469b90756013ab90f7bc5fe9381bba065832fe898151a844d83ed28bb5538
Apr 17 08:05:12.407802 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.407719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz" event={"ID":"b52aae27-b89c-4591-ae6b-d324992aef0c","Type":"ContainerStarted","Data":"edf469b90756013ab90f7bc5fe9381bba065832fe898151a844d83ed28bb5538"}
Apr 17 08:05:12.449733 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.449690 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-9849b9fd7-vhprn"]
Apr 17 08:05:12.454418 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.454392 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.458383 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.458361 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 08:05:12.458744 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.458721 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 08:05:12.458864 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.458741 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 08:05:12.458864 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.458766 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 08:05:12.459873 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.459849 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-hhc2x\""
Apr 17 08:05:12.460009 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.459852 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 08:05:12.476564 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.476525 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 08:05:12.477694 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.477668 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-9849b9fd7-vhprn"]
Apr 17 08:05:12.544355 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544317 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-metrics-client-ca\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544553 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-telemeter-client-tls\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544553 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544463 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-serving-certs-ca-bundle\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544553 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-federate-client-tls\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544553 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-secret-telemeter-client\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544729 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544618 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544729 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544676 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85snj\" (UniqueName: \"kubernetes.io/projected/fe2a37d6-7621-4fe0-933f-ddad7873f146-kube-api-access-85snj\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.544729 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.544709 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.645891 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.645846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-metrics-client-ca\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646083 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.645970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-telemeter-client-tls\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646083 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-serving-certs-ca-bundle\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646081 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-federate-client-tls\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-secret-telemeter-client\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646208 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646190 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85snj\" (UniqueName: \"kubernetes.io/projected/fe2a37d6-7621-4fe0-933f-ddad7873f146-kube-api-access-85snj\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.646399 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.647179 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.646688 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-metrics-client-ca\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.647179 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.647046 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-serving-certs-ca-bundle\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.647909 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.647859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe2a37d6-7621-4fe0-933f-ddad7873f146-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.649157 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.649130 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.649430 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.649399 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-secret-telemeter-client\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.649516 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.649471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-telemeter-client-tls\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.649953 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.649916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fe2a37d6-7621-4fe0-933f-ddad7873f146-federate-client-tls\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.655672 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.655649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85snj\" (UniqueName: \"kubernetes.io/projected/fe2a37d6-7621-4fe0-933f-ddad7873f146-kube-api-access-85snj\") pod \"telemeter-client-9849b9fd7-vhprn\" (UID: \"fe2a37d6-7621-4fe0-933f-ddad7873f146\") " pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.765666 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.765631 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn"
Apr 17 08:05:12.921536 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:12.921507 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-9849b9fd7-vhprn"]
Apr 17 08:05:12.925099 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:12.925065 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe2a37d6_7621_4fe0_933f_ddad7873f146.slice/crio-8596490b628d1a41d88db347f606d58249f40197eb20e741970cb0300a4c3586 WatchSource:0}: Error finding container 8596490b628d1a41d88db347f606d58249f40197eb20e741970cb0300a4c3586: Status 404 returned error can't find the container with id 8596490b628d1a41d88db347f606d58249f40197eb20e741970cb0300a4c3586
Apr 17 08:05:13.413845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.413765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1"}
Apr 17 08:05:13.413845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.413807 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24"}
Apr 17 08:05:13.413845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.413817 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b"}
Apr 17 08:05:13.413845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.413826 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c"}
Apr 17 08:05:13.413845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.413834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a"}
Apr 17 08:05:13.415050 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.415017 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn" event={"ID":"fe2a37d6-7621-4fe0-933f-ddad7873f146","Type":"ContainerStarted","Data":"8596490b628d1a41d88db347f606d58249f40197eb20e741970cb0300a4c3586"}
Apr 17 08:05:13.456483 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.456448 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 08:05:13.460911 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.460883 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.464136 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464107 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 08:05:13.464247 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464169 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 08:05:13.464247 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464176 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 08:05:13.464247 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464196 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 08:05:13.464247 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464208 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-enrldquh127v0\""
Apr 17 08:05:13.464655 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464609 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-nhlc9\""
Apr 17 08:05:13.464760 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464681 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 08:05:13.464760 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464740 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 08:05:13.464864 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464807 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 08:05:13.464913 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.464853 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 08:05:13.465750 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.465694 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 08:05:13.465750 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.465712 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 08:05:13.465750 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.465697 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 08:05:13.468250 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.468226 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 08:05:13.473175 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.473154 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 08:05:13.554328 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554497 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554392 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554497 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554621 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554621 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554621 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554776 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554645 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-web-config\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554776 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554776 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554734 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554888 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554800 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config-out\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554888 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.554888 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554855 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.555036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.555036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.555036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.554951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.555036 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.555018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.555223 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.555108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.555223 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.555153 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht45k\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-kube-api-access-ht45k\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.656071 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:13.656235 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config-out\") pod \"prometheus-k8s-0\"
(UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656235 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656235 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656235 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656235 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656333 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht45k\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-kube-api-access-ht45k\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656430 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656818 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656528 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656818 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656818 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656818 ip-10-0-138-63 kubenswrapper[2570]: 
I0417 08:05:13.656667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-web-config\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.656818 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.656692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.657853 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.657753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.658341 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.658001 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.659237 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.658863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
17 08:05:13.659648 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.659375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.661186 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.660849 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.661186 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.660955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.661342 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.661316 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.661770 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.661744 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.662299 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.662272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.663211 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.662706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.663211 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.663175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.663468 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.663413 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.663970 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.663900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.664683 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.664400 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-web-config\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.664683 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.664649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.664869 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.664846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.665826 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.665797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.669067 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.669045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht45k\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-kube-api-access-ht45k\") pod \"prometheus-k8s-0\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.772704 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.772661 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:05:13.989478 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:13.989440 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:05:13.994421 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:13.994367 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe2b7a6_23b5_4b49_8e94_86e6562f49bc.slice/crio-949f8e54edcb2ba1fd677c63e7c29edb0f349f9487ecd523e6bc9e6aa4143ac0 WatchSource:0}: Error finding container 949f8e54edcb2ba1fd677c63e7c29edb0f349f9487ecd523e6bc9e6aa4143ac0: Status 404 returned error can't find the container with id 949f8e54edcb2ba1fd677c63e7c29edb0f349f9487ecd523e6bc9e6aa4143ac0 Apr 17 08:05:14.420501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:14.420462 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} Apr 17 08:05:14.420501 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:14.420507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"949f8e54edcb2ba1fd677c63e7c29edb0f349f9487ecd523e6bc9e6aa4143ac0"} Apr 17 08:05:14.422176 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:14.422148 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz" event={"ID":"b52aae27-b89c-4591-ae6b-d324992aef0c","Type":"ContainerStarted","Data":"e1106dd4eec648e2ec90a334860d221cc9a4e0ac9b5f9efd5ddb8aeeeabe6489"} Apr 
17 08:05:14.461395 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:14.461297 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz" podStartSLOduration=1.7912869200000001 podStartE2EDuration="3.461282518s" podCreationTimestamp="2026-04-17 08:05:11 +0000 UTC" firstStartedPulling="2026-04-17 08:05:12.141322476 +0000 UTC m=+152.898799881" lastFinishedPulling="2026-04-17 08:05:13.811318067 +0000 UTC m=+154.568795479" observedRunningTime="2026-04-17 08:05:14.460793215 +0000 UTC m=+155.218270643" watchObservedRunningTime="2026-04-17 08:05:14.461282518 +0000 UTC m=+155.218759941" Apr 17 08:05:15.084075 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:15.084031 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" Apr 17 08:05:15.098285 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:15.098259 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-252zk" podUID="421db932-74ef-4855-b174-a7ce6bca201b" Apr 17 08:05:15.107442 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:15.107403 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wp9g5" podUID="116a85c5-54d8-4462-9305-b1de37bca8cf" Apr 17 08:05:15.426584 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.426486 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn" 
event={"ID":"fe2a37d6-7621-4fe0-933f-ddad7873f146","Type":"ContainerStarted","Data":"95db744e6780c000a1b58b9fffb007b81e5274ccc3f7c915696236ad5710e5b1"} Apr 17 08:05:15.426584 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.426532 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn" event={"ID":"fe2a37d6-7621-4fe0-933f-ddad7873f146","Type":"ContainerStarted","Data":"a7db3ba95b41a7a8bd026b9b7b60e8fc88337af4782acd004415925144f77b9f"} Apr 17 08:05:15.426584 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.426549 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn" event={"ID":"fe2a37d6-7621-4fe0-933f-ddad7873f146","Type":"ContainerStarted","Data":"460d23ca1aab2fa010722575318f609e348ef83e5da9aa03161757b5ba2de944"} Apr 17 08:05:15.429244 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.429214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerStarted","Data":"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7"} Apr 17 08:05:15.430513 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.430488 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" exitCode=0 Apr 17 08:05:15.430624 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.430569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} Apr 17 08:05:15.430624 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.430610 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-252zk" Apr 17 08:05:15.430712 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.430637 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:05:15.449161 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.449121 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-9849b9fd7-vhprn" podStartSLOduration=1.662045013 podStartE2EDuration="3.449108164s" podCreationTimestamp="2026-04-17 08:05:12 +0000 UTC" firstStartedPulling="2026-04-17 08:05:12.92727555 +0000 UTC m=+153.684753186" lastFinishedPulling="2026-04-17 08:05:14.714338928 +0000 UTC m=+155.471816337" observedRunningTime="2026-04-17 08:05:15.448079791 +0000 UTC m=+156.205557241" watchObservedRunningTime="2026-04-17 08:05:15.449108164 +0000 UTC m=+156.206585591" Apr 17 08:05:15.498804 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:15.498751 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.315825986 podStartE2EDuration="7.49873713s" podCreationTimestamp="2026-04-17 08:05:08 +0000 UTC" firstStartedPulling="2026-04-17 08:05:09.487218535 +0000 UTC m=+150.244695943" lastFinishedPulling="2026-04-17 08:05:14.670129665 +0000 UTC m=+155.427607087" observedRunningTime="2026-04-17 08:05:15.496624689 +0000 UTC m=+156.254102110" watchObservedRunningTime="2026-04-17 08:05:15.49873713 +0000 UTC m=+156.256214559" Apr 17 08:05:16.896123 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:05:16.896063 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-x6js9" podUID="825bc295-b53d-4e6b-9c7e-ad30d2d38c65" Apr 17 08:05:19.445131 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:05:19.445099 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} Apr 17 08:05:19.445474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:19.445138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} Apr 17 08:05:20.027424 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.027385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:05:20.027625 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.027438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:05:20.027625 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.027489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:05:20.030252 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.030222 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/421db932-74ef-4855-b174-a7ce6bca201b-metrics-tls\") pod \"dns-default-252zk\" (UID: \"421db932-74ef-4855-b174-a7ce6bca201b\") " pod="openshift-dns/dns-default-252zk" Apr 17 08:05:20.030378 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.030278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"image-registry-64cb969cfb-fb4ps\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") " pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:05:20.030445 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.030385 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/116a85c5-54d8-4462-9305-b1de37bca8cf-cert\") pod \"ingress-canary-wp9g5\" (UID: \"116a85c5-54d8-4462-9305-b1de37bca8cf\") " pod="openshift-ingress-canary/ingress-canary-wp9g5" Apr 17 08:05:20.234270 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.234235 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5m5pl\"" Apr 17 08:05:20.234270 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.234248 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wdhxs\"" Apr 17 08:05:20.241744 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.241700 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" Apr 17 08:05:20.241744 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.241731 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-252zk"
Apr 17 08:05:20.446834 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.446749 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-252zk"]
Apr 17 08:05:20.449342 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:20.449316 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421db932_74ef_4855_b174_a7ce6bca201b.slice/crio-ab910704ed5aae4455b8cd89ef443f8f14f5baea9c7ed1cd148994240de200cf WatchSource:0}: Error finding container ab910704ed5aae4455b8cd89ef443f8f14f5baea9c7ed1cd148994240de200cf: Status 404 returned error can't find the container with id ab910704ed5aae4455b8cd89ef443f8f14f5baea9c7ed1cd148994240de200cf
Apr 17 08:05:20.468004 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:20.467978 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64cb969cfb-fb4ps"]
Apr 17 08:05:21.455426 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.455340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"}
Apr 17 08:05:21.455426 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.455385 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"}
Apr 17 08:05:21.455426 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.455399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"}
Apr 17 08:05:21.455426 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.455412 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerStarted","Data":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"}
Apr 17 08:05:21.456495 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.456471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-252zk" event={"ID":"421db932-74ef-4855-b174-a7ce6bca201b","Type":"ContainerStarted","Data":"ab910704ed5aae4455b8cd89ef443f8f14f5baea9c7ed1cd148994240de200cf"}
Apr 17 08:05:21.457595 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.457567 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" event={"ID":"73989292-93a5-4241-9cd4-5833981ca4eb","Type":"ContainerStarted","Data":"b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4"}
Apr 17 08:05:21.457678 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.457601 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" event={"ID":"73989292-93a5-4241-9cd4-5833981ca4eb","Type":"ContainerStarted","Data":"292f98c30c0a529292623a4467593f387850a172e6c19d3b1869c2283cfee662"}
Apr 17 08:05:21.457726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.457691 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps"
Apr 17 08:05:21.484303 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.484106 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.54290505 podStartE2EDuration="8.484086908s" podCreationTimestamp="2026-04-17 08:05:13 +0000 UTC" firstStartedPulling="2026-04-17 08:05:15.431620385 +0000 UTC m=+156.189097792" lastFinishedPulling="2026-04-17 08:05:20.372802229 +0000 UTC m=+161.130279650" observedRunningTime="2026-04-17 08:05:21.482262641 +0000 UTC m=+162.239740091" watchObservedRunningTime="2026-04-17 08:05:21.484086908 +0000 UTC m=+162.241564336"
Apr 17 08:05:21.500521 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:21.500469 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" podStartSLOduration=163.500452443 podStartE2EDuration="2m43.500452443s" podCreationTimestamp="2026-04-17 08:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:05:21.499181289 +0000 UTC m=+162.256658728" watchObservedRunningTime="2026-04-17 08:05:21.500452443 +0000 UTC m=+162.257929908"
Apr 17 08:05:23.468525 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:23.468482 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-252zk" event={"ID":"421db932-74ef-4855-b174-a7ce6bca201b","Type":"ContainerStarted","Data":"05c5eca69da1c2618baf77b598550938d5af682c8956751a944e4bca7641eb60"}
Apr 17 08:05:23.468525 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:23.468527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-252zk" event={"ID":"421db932-74ef-4855-b174-a7ce6bca201b","Type":"ContainerStarted","Data":"6b7bd6041942c8036d7ed182908fed0dcb4e2b5d0528cf6b6ecc3b571283590a"}
Apr 17 08:05:23.468965 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:23.468612 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-252zk"
Apr 17 08:05:23.484263 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:23.484213 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-252zk" podStartSLOduration=129.329626449 podStartE2EDuration="2m11.484197921s" podCreationTimestamp="2026-04-17 08:03:12 +0000 UTC" firstStartedPulling="2026-04-17 08:05:20.451706478 +0000 UTC m=+161.209183899" lastFinishedPulling="2026-04-17 08:05:22.606277962 +0000 UTC m=+163.363755371" observedRunningTime="2026-04-17 08:05:23.483803948 +0000 UTC m=+164.241281371" watchObservedRunningTime="2026-04-17 08:05:23.484197921 +0000 UTC m=+164.241675354"
Apr 17 08:05:23.773410 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:23.773380 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:05:28.882913 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:28.882862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wp9g5"
Apr 17 08:05:28.885694 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:28.885671 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x9bwm\""
Apr 17 08:05:28.893985 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:28.893968 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wp9g5"
Apr 17 08:05:29.011267 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:29.011241 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wp9g5"]
Apr 17 08:05:29.013222 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:05:29.013190 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116a85c5_54d8_4462_9305_b1de37bca8cf.slice/crio-16b2a423ac4ca545e31e7ef004d7523f83e4417135e31b601761875816db871d WatchSource:0}: Error finding container 16b2a423ac4ca545e31e7ef004d7523f83e4417135e31b601761875816db871d: Status 404 returned error can't find the container with id 16b2a423ac4ca545e31e7ef004d7523f83e4417135e31b601761875816db871d
Apr 17 08:05:29.487413 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:29.487374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wp9g5" event={"ID":"116a85c5-54d8-4462-9305-b1de37bca8cf","Type":"ContainerStarted","Data":"16b2a423ac4ca545e31e7ef004d7523f83e4417135e31b601761875816db871d"}
Apr 17 08:05:30.882680 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:30.882656 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9"
Apr 17 08:05:31.493881 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:31.493842 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wp9g5" event={"ID":"116a85c5-54d8-4462-9305-b1de37bca8cf","Type":"ContainerStarted","Data":"e85f355c23fa585d8429b862b9ca17386a86024a284ef30abc688054767d6f55"}
Apr 17 08:05:31.509758 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:31.509694 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wp9g5" podStartSLOduration=137.69484502 podStartE2EDuration="2m19.509679706s" podCreationTimestamp="2026-04-17 08:03:12 +0000 UTC" firstStartedPulling="2026-04-17 08:05:29.015200638 +0000 UTC m=+169.772678045" lastFinishedPulling="2026-04-17 08:05:30.830035324 +0000 UTC m=+171.587512731" observedRunningTime="2026-04-17 08:05:31.508374148 +0000 UTC m=+172.265851579" watchObservedRunningTime="2026-04-17 08:05:31.509679706 +0000 UTC m=+172.267157133"
Apr 17 08:05:31.989751 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:31.989715 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:31.990163 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:31.989764 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:33.473880 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:33.473844 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-252zk"
Apr 17 08:05:39.518069 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:39.518034 2570 generic.go:358] "Generic (PLEG): container finished" podID="29f0e625-d1d8-412e-8a62-f3d9c9c33c3e" containerID="d1e9cae1a4f122a885ba55ca2a6bdbb538b1dd844ea7a9f6bf8c35db1321c6cc" exitCode=0
Apr 17 08:05:39.518438 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:39.518108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vwznj" event={"ID":"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e","Type":"ContainerDied","Data":"d1e9cae1a4f122a885ba55ca2a6bdbb538b1dd844ea7a9f6bf8c35db1321c6cc"}
Apr 17 08:05:39.518484 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:39.518448 2570 scope.go:117] "RemoveContainer" containerID="d1e9cae1a4f122a885ba55ca2a6bdbb538b1dd844ea7a9f6bf8c35db1321c6cc"
Apr 17 08:05:40.246678 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:40.246626 2570 patch_prober.go:28] interesting pod/image-registry-64cb969cfb-fb4ps container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 08:05:40.246870 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:40.246701 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 08:05:40.523035 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:40.523000 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vwznj" event={"ID":"29f0e625-d1d8-412e-8a62-f3d9c9c33c3e","Type":"ContainerStarted","Data":"1d159742a42b2298cb898ae419f4ca8bdd0d2dda679f9084a9733bc2df68d945"}
Apr 17 08:05:40.836539 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:40.836461 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-252zk_421db932-74ef-4855-b174-a7ce6bca201b/dns/0.log"
Apr 17 08:05:41.036676 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:41.036645 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-252zk_421db932-74ef-4855-b174-a7ce6bca201b/kube-rbac-proxy/0.log"
Apr 17 08:05:42.036180 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:42.036151 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cqc8h_e63f1d9b-13b7-4099-ad63-64b33b70f697/dns-node-resolver/0.log"
Apr 17 08:05:42.466372 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:42.466300 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps"
Apr 17 08:05:42.637159 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:42.637129 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-64cb969cfb-fb4ps_73989292-93a5-4241-9cd4-5833981ca4eb/registry/0.log"
Apr 17 08:05:43.436635 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:43.436607 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-skwnw_2c3085fe-841e-4ff9-aa63-90a0b035c240/node-ca/0.log"
Apr 17 08:05:44.236684 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:44.236650 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wp9g5_116a85c5-54d8-4462-9305-b1de37bca8cf/serve-healthcheck-canary/0.log"
Apr 17 08:05:49.130109 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:49.130073 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64cb969cfb-fb4ps"]
Apr 17 08:05:50.555910 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:50.555873 2570 generic.go:358] "Generic (PLEG): container finished" podID="c9d36857-0992-41ad-aa34-4e41c08ace48" containerID="667190fc7c0578db12977b7d2b4fc820931437af8ab5d42a8dbd1965ec5b7888" exitCode=0
Apr 17 08:05:50.556394 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:50.555901 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" event={"ID":"c9d36857-0992-41ad-aa34-4e41c08ace48","Type":"ContainerDied","Data":"667190fc7c0578db12977b7d2b4fc820931437af8ab5d42a8dbd1965ec5b7888"}
Apr 17 08:05:50.556394 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:50.556326 2570 scope.go:117] "RemoveContainer" containerID="667190fc7c0578db12977b7d2b4fc820931437af8ab5d42a8dbd1965ec5b7888"
Apr 17 08:05:51.561404 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:51.561368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-csrrb" event={"ID":"c9d36857-0992-41ad-aa34-4e41c08ace48","Type":"ContainerStarted","Data":"2116c96b6c3df84c65f9d98d5c6390f1058a1739b7a7396e4bc7ca556d838801"}
Apr 17 08:05:51.995542 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:51.995470 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:05:52.000022 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:05:51.999995 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55b96554df-2z2jz"
Apr 17 08:06:13.773320 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:13.773274 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:06:13.793105 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:13.793076 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:06:14.155494 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:14.155390 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" containerName="registry" containerID="cri-o://b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4" gracePeriod=30
Apr 17 08:06:14.649167 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:14.649142 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 08:06:15.398307 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.398283 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps"
Apr 17 08:06:15.455364 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455333 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455534 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455383 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-installation-pull-secrets\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455534 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455415 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-trusted-ca\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455534 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455458 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73989292-93a5-4241-9cd4-5833981ca4eb-ca-trust-extracted\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455534 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455530 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdddd\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-kube-api-access-vdddd\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455748 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455558 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-bound-sa-token\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455748 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455603 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-image-registry-private-configuration\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455748 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455635 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-registry-certificates\") pod \"73989292-93a5-4241-9cd4-5833981ca4eb\" (UID: \"73989292-93a5-4241-9cd4-5833981ca4eb\") "
Apr 17 08:06:15.455965 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.455920 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:06:15.456263 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.456226 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 08:06:15.458380 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.458307 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:06:15.458380 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.458322 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:06:15.458534 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.458362 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:06:15.458534 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.458430 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-kube-api-access-vdddd" (OuterVolumeSpecName: "kube-api-access-vdddd") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "kube-api-access-vdddd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 08:06:15.458637 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.458598 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 08:06:15.466391 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.466364 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73989292-93a5-4241-9cd4-5833981ca4eb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "73989292-93a5-4241-9cd4-5833981ca4eb" (UID: "73989292-93a5-4241-9cd4-5833981ca4eb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 08:06:15.556803 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556761 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-trusted-ca\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.556803 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556799 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/73989292-93a5-4241-9cd4-5833981ca4eb-ca-trust-extracted\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.556803 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556810 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdddd\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-kube-api-access-vdddd\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.556803 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556819 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-bound-sa-token\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.557060 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556829 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-image-registry-private-configuration\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.557060 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556840 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/73989292-93a5-4241-9cd4-5833981ca4eb-registry-certificates\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.557060 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556849 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/73989292-93a5-4241-9cd4-5833981ca4eb-registry-tls\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.557060 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.556858 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/73989292-93a5-4241-9cd4-5833981ca4eb-installation-pull-secrets\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\""
Apr 17 08:06:15.637452 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.637373 2570 generic.go:358] "Generic (PLEG): container finished" podID="73989292-93a5-4241-9cd4-5833981ca4eb" containerID="b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4" exitCode=0
Apr 17 08:06:15.637452 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.637434 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps"
Apr 17 08:06:15.637605 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.637458 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" event={"ID":"73989292-93a5-4241-9cd4-5833981ca4eb","Type":"ContainerDied","Data":"b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4"}
Apr 17 08:06:15.637605 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.637498 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64cb969cfb-fb4ps" event={"ID":"73989292-93a5-4241-9cd4-5833981ca4eb","Type":"ContainerDied","Data":"292f98c30c0a529292623a4467593f387850a172e6c19d3b1869c2283cfee662"}
Apr 17 08:06:15.637605 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.637515 2570 scope.go:117] "RemoveContainer" containerID="b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4"
Apr 17 08:06:15.645977 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.645959 2570 scope.go:117] "RemoveContainer" containerID="b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4"
Apr 17 08:06:15.646224 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:15.646202 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4\": container with ID starting with b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4 not found: ID does not exist" containerID="b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4"
Apr 17 08:06:15.646302 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.646229 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4"} err="failed to get container status \"b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4\": rpc error: code = NotFound desc = could not find container \"b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4\": container with ID starting with b671bbc4e1df4d4323b9949bd70dce30268154ea5ba341b9f186350faf157cf4 not found: ID does not exist"
Apr 17 08:06:15.658965 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.658923 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64cb969cfb-fb4ps"]
Apr 17 08:06:15.661888 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.661868 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64cb969cfb-fb4ps"]
Apr 17 08:06:15.888216 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:15.888097 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" path="/var/lib/kubelet/pods/73989292-93a5-4241-9cd4-5833981ca4eb/volumes"
Apr 17 08:06:27.575296 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575260 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 08:06:27.575725 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575695 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="alertmanager" containerID="cri-o://74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a" gracePeriod=120
Apr 17 08:06:27.575800 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575737 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-metric" containerID="cri-o://7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1" gracePeriod=120
Apr 17 08:06:27.575800 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575760 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-web" containerID="cri-o://6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b" gracePeriod=120
Apr 17 08:06:27.575891 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575801 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="prom-label-proxy" containerID="cri-o://47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7" gracePeriod=120
Apr 17 08:06:27.575891 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575846 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="config-reloader" containerID="cri-o://452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c" gracePeriod=120
Apr 17 08:06:27.576008 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:27.575800 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy" containerID="cri-o://5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24" gracePeriod=120
Apr 17 08:06:28.680984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.680927 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7" exitCode=0
Apr 17 08:06:28.680984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.680983 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24" exitCode=0
Apr 17 08:06:28.680984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.680992 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c" exitCode=0
Apr 17 08:06:28.681406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.680999 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a" exitCode=0
Apr 17 08:06:28.681406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.680992 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7"}
Apr 17 08:06:28.681406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.681036 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24"}
Apr 17 08:06:28.681406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.681052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c"}
Apr 17 08:06:28.681406 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.681066 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a"}
Apr 17 08:06:28.823276 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.823253 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 08:06:28.878849 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.878821 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-config-volume\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879021 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.878864 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879021 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.878928 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7tp6\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-kube-api-access-j7tp6\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879021 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.878966 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-tls-assets\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879021 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.878993 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-web\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879025 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-cluster-tls-config\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879076 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-web-config\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879108 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-metric\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879149 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879175 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-trusted-ca-bundle\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879240 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879202 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-metrics-client-ca\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879492 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879254 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-main-db\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.879492 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.879285 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-config-out\") pod \"5ccb7968-cc78-410b-b649-4ac84c22e844\" (UID: \"5ccb7968-cc78-410b-b649-4ac84c22e844\") "
Apr 17 08:06:28.880165 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.880108 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:28.880165 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.880140 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:28.880474 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.880333 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:28.882486 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.882461 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-kube-api-access-j7tp6" (OuterVolumeSpecName: "kube-api-access-j7tp6") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "kube-api-access-j7tp6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:28.883696 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.883254 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-config-volume" (OuterVolumeSpecName: "config-volume") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.883696 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.883608 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.883696 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.883645 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-config-out" (OuterVolumeSpecName: "config-out") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:28.883978 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.883746 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.884063 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.884046 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.884345 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.884318 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:28.884345 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.884328 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.888570 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.888415 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.895727 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.895669 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-web-config" (OuterVolumeSpecName: "web-config") pod "5ccb7968-cc78-410b-b649-4ac84c22e844" (UID: "5ccb7968-cc78-410b-b649-4ac84c22e844"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:28.980278 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980251 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-config-out\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980278 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980276 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-config-volume\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980287 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-main-tls\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980298 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j7tp6\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-kube-api-access-j7tp6\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980306 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5ccb7968-cc78-410b-b649-4ac84c22e844-tls-assets\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980316 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 
08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980325 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-cluster-tls-config\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980333 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-web-config\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980344 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980353 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5ccb7968-cc78-410b-b649-4ac84c22e844-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980362 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980370 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ccb7968-cc78-410b-b649-4ac84c22e844-metrics-client-ca\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:28.980419 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:28.980378 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5ccb7968-cc78-410b-b649-4ac84c22e844-alertmanager-main-db\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:29.686323 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686291 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1" exitCode=0 Apr 17 08:06:29.686323 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686318 2570 generic.go:358] "Generic (PLEG): container finished" podID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerID="6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b" exitCode=0 Apr 17 08:06:29.686732 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686370 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1"} Apr 17 08:06:29.686732 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686413 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.686732 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686429 2570 scope.go:117] "RemoveContainer" containerID="47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7" Apr 17 08:06:29.686732 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686415 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b"} Apr 17 08:06:29.686732 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.686551 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5ccb7968-cc78-410b-b649-4ac84c22e844","Type":"ContainerDied","Data":"8e2d4cc3cf5ab5f7305bb152a09d21f63cf2801fa4d1f10bfff4eb99032ed297"} Apr 17 08:06:29.696392 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.696372 2570 scope.go:117] "RemoveContainer" containerID="7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1" Apr 17 08:06:29.703241 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.703224 2570 scope.go:117] "RemoveContainer" containerID="5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24" Apr 17 08:06:29.710031 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.710009 2570 scope.go:117] "RemoveContainer" containerID="6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b" Apr 17 08:06:29.711529 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.711507 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:06:29.714336 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.714316 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:06:29.717335 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.717315 2570 scope.go:117] 
"RemoveContainer" containerID="452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c" Apr 17 08:06:29.723435 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.723417 2570 scope.go:117] "RemoveContainer" containerID="74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a" Apr 17 08:06:29.730389 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.730371 2570 scope.go:117] "RemoveContainer" containerID="faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937" Apr 17 08:06:29.737143 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.737127 2570 scope.go:117] "RemoveContainer" containerID="47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7" Apr 17 08:06:29.737404 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:29.737382 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7\": container with ID starting with 47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7 not found: ID does not exist" containerID="47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7" Apr 17 08:06:29.737450 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.737416 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7"} err="failed to get container status \"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7\": rpc error: code = NotFound desc = could not find container \"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7\": container with ID starting with 47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7 not found: ID does not exist" Apr 17 08:06:29.737450 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.737443 2570 scope.go:117] "RemoveContainer" containerID="7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1" Apr 17 
08:06:29.737684 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:29.737658 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1\": container with ID starting with 7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1 not found: ID does not exist" containerID="7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1" Apr 17 08:06:29.737684 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.737672 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:06:29.737753 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.737685 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1"} err="failed to get container status \"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1\": rpc error: code = NotFound desc = could not find container \"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1\": container with ID starting with 7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1 not found: ID does not exist" Apr 17 08:06:29.737753 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.737709 2570 scope.go:117] "RemoveContainer" containerID="5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24" Apr 17 08:06:29.737980 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:29.737960 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24\": container with ID starting with 5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24 not found: ID does not exist" containerID="5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24" Apr 17 08:06:29.738042 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:29.737989 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24"} err="failed to get container status \"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24\": rpc error: code = NotFound desc = could not find container \"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24\": container with ID starting with 5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24 not found: ID does not exist" Apr 17 08:06:29.738042 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738006 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-web" Apr 17 08:06:29.738042 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738010 2570 scope.go:117] "RemoveContainer" containerID="6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b" Apr 17 08:06:29.738042 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738019 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-web" Apr 17 08:06:29.738042 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738034 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="prom-label-proxy" Apr 17 08:06:29.738042 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738040 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="prom-label-proxy" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738050 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="config-reloader" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738056 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="config-reloader" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738064 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738069 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738078 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="init-config-reloader" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738084 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="init-config-reloader" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738093 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" containerName="registry" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738098 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" containerName="registry" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738104 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="alertmanager" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738110 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="alertmanager" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738117 2570 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-metric" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738122 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-metric" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738167 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738176 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-web" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738184 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="alertmanager" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738190 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="config-reloader" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738197 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="73989292-93a5-4241-9cd4-5833981ca4eb" containerName="registry" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738203 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="kube-rbac-proxy-metric" Apr 17 08:06:29.738246 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738209 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" containerName="prom-label-proxy" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: E0417 
08:06:29.738248 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b\": container with ID starting with 6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b not found: ID does not exist" containerID="6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738272 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b"} err="failed to get container status \"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b\": rpc error: code = NotFound desc = could not find container \"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b\": container with ID starting with 6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b not found: ID does not exist" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738294 2570 scope.go:117] "RemoveContainer" containerID="452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:29.738528 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c\": container with ID starting with 452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c not found: ID does not exist" containerID="452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738552 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c"} err="failed to get container status 
\"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c\": rpc error: code = NotFound desc = could not find container \"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c\": container with ID starting with 452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c not found: ID does not exist" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738572 2570 scope.go:117] "RemoveContainer" containerID="74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:29.738821 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a\": container with ID starting with 74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a not found: ID does not exist" containerID="74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738847 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a"} err="failed to get container status \"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a\": rpc error: code = NotFound desc = could not find container \"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a\": container with ID starting with 74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a not found: ID does not exist" Apr 17 08:06:29.738866 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.738862 2570 scope.go:117] "RemoveContainer" containerID="faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937" Apr 17 08:06:29.739263 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:29.739112 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937\": container with ID starting with faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937 not found: ID does not exist" containerID="faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937" Apr 17 08:06:29.739263 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739142 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937"} err="failed to get container status \"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937\": rpc error: code = NotFound desc = could not find container \"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937\": container with ID starting with faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937 not found: ID does not exist" Apr 17 08:06:29.739263 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739165 2570 scope.go:117] "RemoveContainer" containerID="47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7" Apr 17 08:06:29.739426 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739407 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7"} err="failed to get container status \"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7\": rpc error: code = NotFound desc = could not find container \"47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7\": container with ID starting with 47010a5fadaaef2b66c22b5b7653210cfb782967da57fce8b292e0d830b9b8c7 not found: ID does not exist" Apr 17 08:06:29.739473 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739429 2570 scope.go:117] "RemoveContainer" containerID="7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1" Apr 17 08:06:29.739657 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739640 
2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1"} err="failed to get container status \"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1\": rpc error: code = NotFound desc = could not find container \"7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1\": container with ID starting with 7e6ffb48b9945d29cbceefbe82670b4d4ad8a41754b06be965ff7bbd1dcbcee1 not found: ID does not exist" Apr 17 08:06:29.739731 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739669 2570 scope.go:117] "RemoveContainer" containerID="5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24" Apr 17 08:06:29.739895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739874 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24"} err="failed to get container status \"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24\": rpc error: code = NotFound desc = could not find container \"5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24\": container with ID starting with 5c325a8c85adb496a1ce66e5b49765a31a414a97c67ed5a87eafb89e65499d24 not found: ID does not exist" Apr 17 08:06:29.739959 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.739896 2570 scope.go:117] "RemoveContainer" containerID="6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b" Apr 17 08:06:29.740173 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740156 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b"} err="failed to get container status \"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b\": rpc error: code = NotFound desc = could not find container 
\"6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b\": container with ID starting with 6c7c2c9493929ef70746ac86769dc87a453323c5466dcc1ecd03d75dccb8c44b not found: ID does not exist" Apr 17 08:06:29.740242 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740174 2570 scope.go:117] "RemoveContainer" containerID="452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c" Apr 17 08:06:29.740442 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740425 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c"} err="failed to get container status \"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c\": rpc error: code = NotFound desc = could not find container \"452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c\": container with ID starting with 452def4d92f2dab13f354d2404cb0b0b9fdf6226b6fedd665e85ab3f837a106c not found: ID does not exist" Apr 17 08:06:29.740487 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740443 2570 scope.go:117] "RemoveContainer" containerID="74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a" Apr 17 08:06:29.740627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740613 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a"} err="failed to get container status \"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a\": rpc error: code = NotFound desc = could not find container \"74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a\": container with ID starting with 74ffdbeff2b8e97361a5af7662b9d9e1b10b0bcdcec61a6a5d5a7b1b48f6bb4a not found: ID does not exist" Apr 17 08:06:29.740673 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740627 2570 scope.go:117] "RemoveContainer" 
containerID="faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937" Apr 17 08:06:29.740821 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.740804 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937"} err="failed to get container status \"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937\": rpc error: code = NotFound desc = could not find container \"faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937\": container with ID starting with faf32824a7e293d31285d9805ecdd67f46954fc35fe4e40b38411a6fabda2937 not found: ID does not exist" Apr 17 08:06:29.742458 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.742443 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745435 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745450 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745476 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745450 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745487 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745513 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 08:06:29.745586 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745482 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 08:06:29.746016 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745861 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-8gqgb\"" Apr 17 08:06:29.746016 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.745949 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 08:06:29.752731 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.752705 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 08:06:29.756436 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.756415 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:06:29.886959 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.886898 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ccb7968-cc78-410b-b649-4ac84c22e844" path="/var/lib/kubelet/pods/5ccb7968-cc78-410b-b649-4ac84c22e844/volumes" Apr 17 08:06:29.887770 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.887745 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.887902 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.887786 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.887902 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.887817 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-config-volume\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.887902 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.887885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c30c34aa-650e-46d7-9669-07dd9872db26-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.887949 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.887979 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30c34aa-650e-46d7-9669-07dd9872db26-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888019 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30c34aa-650e-46d7-9669-07dd9872db26-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888040 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-web-config\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888234 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888069 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888234 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c34aa-650e-46d7-9669-07dd9872db26-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888234 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888140 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888234 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888192 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjjm\" (UniqueName: \"kubernetes.io/projected/c30c34aa-650e-46d7-9669-07dd9872db26-kube-api-access-vrjjm\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.888392 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.888238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30c34aa-650e-46d7-9669-07dd9872db26-config-out\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.988677 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.988602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c30c34aa-650e-46d7-9669-07dd9872db26-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.988677 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.988643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.988677 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.988674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30c34aa-650e-46d7-9669-07dd9872db26-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.988924 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.988713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30c34aa-650e-46d7-9669-07dd9872db26-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.988924 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.988735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-web-config\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989057 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.988960 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989057 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c30c34aa-650e-46d7-9669-07dd9872db26-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989057 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989034 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c34aa-650e-46d7-9669-07dd9872db26-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989219 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989219 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrjjm\" (UniqueName: \"kubernetes.io/projected/c30c34aa-650e-46d7-9669-07dd9872db26-kube-api-access-vrjjm\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989219 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30c34aa-650e-46d7-9669-07dd9872db26-config-out\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989356 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:06:29.989240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989356 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989278 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989356 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989307 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-config-volume\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.989873 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.989846 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c30c34aa-650e-46d7-9669-07dd9872db26-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.990153 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.990127 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c30c34aa-650e-46d7-9669-07dd9872db26-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.991753 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.991722 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c30c34aa-650e-46d7-9669-07dd9872db26-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.991845 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.991803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.992457 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.992187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c30c34aa-650e-46d7-9669-07dd9872db26-config-out\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.992561 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.992457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.992561 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.992518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-web-config\") pod \"alertmanager-main-0\" (UID: 
\"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.992694 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.992673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-config-volume\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.992748 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.992673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.993179 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.993158 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.993560 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.993545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c30c34aa-650e-46d7-9669-07dd9872db26-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:29.998147 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:29.998129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrjjm\" (UniqueName: 
\"kubernetes.io/projected/c30c34aa-650e-46d7-9669-07dd9872db26-kube-api-access-vrjjm\") pod \"alertmanager-main-0\" (UID: \"c30c34aa-650e-46d7-9669-07dd9872db26\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:30.054169 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:30.054139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 08:06:30.184963 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:30.184907 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 08:06:30.189274 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:06:30.189238 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30c34aa_650e_46d7_9669_07dd9872db26.slice/crio-7cb361e3cb7b66544aa3964424528d32b8f9501c40e59aab400b4a4a457fc7d9 WatchSource:0}: Error finding container 7cb361e3cb7b66544aa3964424528d32b8f9501c40e59aab400b4a4a457fc7d9: Status 404 returned error can't find the container with id 7cb361e3cb7b66544aa3964424528d32b8f9501c40e59aab400b4a4a457fc7d9 Apr 17 08:06:30.691003 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:30.690965 2570 generic.go:358] "Generic (PLEG): container finished" podID="c30c34aa-650e-46d7-9669-07dd9872db26" containerID="4f886922f0a642d70648328026077360e7b005942594993cd8e1490994e03a5b" exitCode=0 Apr 17 08:06:30.691388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:30.691000 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerDied","Data":"4f886922f0a642d70648328026077360e7b005942594993cd8e1490994e03a5b"} Apr 17 08:06:30.691388 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:30.691040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"7cb361e3cb7b66544aa3964424528d32b8f9501c40e59aab400b4a4a457fc7d9"} Apr 17 08:06:31.698930 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.698899 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"a18fc1cfabb27cce28fba6400f74e9699acf5070e3adc66d52dc89d922e38f6d"} Apr 17 08:06:31.698930 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.698934 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"ccd85f1d440bb75a92a8d4034fbd4ef3798941a2d4e30939a1012eaee11f80bb"} Apr 17 08:06:31.698930 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.698957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"9b82776394b4c75fccb6d1ad76e797790f384aaacf4276f41c7e95f3ec46d349"} Apr 17 08:06:31.699347 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.698965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"7bd184a81b438d6fead0f2db6e6570794a95541ab8ad9bc87dc0cde042cf3eab"} Apr 17 08:06:31.699347 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.698974 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"c202132e58432ba024c176123c652fdfb292d0ab299469f7c68fcf75a7933838"} Apr 17 08:06:31.699347 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.698983 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c30c34aa-650e-46d7-9669-07dd9872db26","Type":"ContainerStarted","Data":"d247cd9af3c34a04d568102d5e5ac76df25974248bca03a4cee80d932a95a64d"} Apr 17 08:06:31.728218 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.728170 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.728151382 podStartE2EDuration="2.728151382s" podCreationTimestamp="2026-04-17 08:06:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:06:31.725511887 +0000 UTC m=+232.482989325" watchObservedRunningTime="2026-04-17 08:06:31.728151382 +0000 UTC m=+232.485628813" Apr 17 08:06:31.804133 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804098 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:06:31.804612 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804563 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="thanos-sidecar" containerID="cri-o://ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" gracePeriod=600 Apr 17 08:06:31.804612 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804580 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-web" containerID="cri-o://35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" gracePeriod=600 Apr 17 08:06:31.804612 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804594 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" 
containerName="config-reloader" containerID="cri-o://6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" gracePeriod=600 Apr 17 08:06:31.804874 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804587 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-thanos" containerID="cri-o://13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" gracePeriod=600 Apr 17 08:06:31.804874 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804577 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy" containerID="cri-o://bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" gracePeriod=600 Apr 17 08:06:31.804874 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:31.804561 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="prometheus" containerID="cri-o://3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" gracePeriod=600 Apr 17 08:06:32.044826 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.044803 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.106163 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106129 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-trusted-ca-bundle\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106163 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106167 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106397 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106186 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-serving-certs-ca-bundle\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106397 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106214 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-rulefiles-0\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106494 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106396 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-web-config\") pod 
\"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106494 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106461 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-metrics-client-certs\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106594 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106512 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-kube-rbac-proxy\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106594 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106546 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106594 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106575 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-grpc-tls\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106612 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-db\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: 
\"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106646 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:32.106726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106659 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:32.106726 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106652 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-tls\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106919 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106723 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht45k\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-kube-api-access-ht45k\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106919 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106757 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-metrics-client-ca\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106919 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106788 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-kubelet-serving-ca-bundle\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106919 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106856 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-tls-assets\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: 
\"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106919 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106882 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config-out\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.106919 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106908 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.107204 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.106932 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-thanos-prometheus-http-client-file\") pod \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\" (UID: \"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc\") " Apr 17 08:06:32.107308 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.107291 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.108642 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.108360 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.108642 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.108001 2570 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:32.108642 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.108324 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:32.109024 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.109002 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:32.109493 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.109270 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.109693 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.109655 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.109823 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.109699 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 08:06:32.110660 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.110628 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.110774 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.110724 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.110981 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.110933 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.111251 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.111219 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:32.111523 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.111498 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config-out" (OuterVolumeSpecName: "config-out") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:06:32.111719 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.111673 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.111787 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.111752 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.112322 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.112303 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config" (OuterVolumeSpecName: "config") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.112420 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.112401 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-kube-api-access-ht45k" (OuterVolumeSpecName: "kube-api-access-ht45k") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "kube-api-access-ht45k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:06:32.121372 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.121351 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-web-config" (OuterVolumeSpecName: "web-config") pod "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" (UID: "0fe2b7a6-23b5-4b49-8e94-86e6562f49bc"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 08:06:32.209348 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209300 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209348 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209340 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209348 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209354 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-web-config\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209367 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-metrics-client-certs\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209380 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-kube-rbac-proxy\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209393 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209406 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-grpc-tls\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209417 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-prometheus-k8s-db\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209431 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209442 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ht45k\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-kube-api-access-ht45k\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209457 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-metrics-client-ca\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209470 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209482 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-tls-assets\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209494 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config-out\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209505 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-config\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.209627 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.209517 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:06:32.707787 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707749 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" exitCode=0 Apr 17 08:06:32.707787 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707778 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" exitCode=0 Apr 17 08:06:32.707787 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:32.707789 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" exitCode=0 Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707797 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" exitCode=0 Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707806 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" exitCode=0 Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707815 2570 generic.go:358] "Generic (PLEG): container finished" podID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" exitCode=0 Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707835 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707873 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707888 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707890 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707898 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707930 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0fe2b7a6-23b5-4b49-8e94-86e6562f49bc","Type":"ContainerDied","Data":"949f8e54edcb2ba1fd677c63e7c29edb0f349f9487ecd523e6bc9e6aa4143ac0"} Apr 17 08:06:32.708248 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.707928 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.715513 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.715492 2570 scope.go:117] "RemoveContainer" 
containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.722398 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.722383 2570 scope.go:117] "RemoveContainer" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.728931 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.728910 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.732770 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.732749 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:06:32.735808 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.735787 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.740477 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.740458 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:06:32.742599 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.742584 2570 scope.go:117] "RemoveContainer" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.749078 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.749055 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.755489 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.755475 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.755742 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.755725 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f 
not found: ID does not exist" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.755781 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.755751 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} err="failed to get container status \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f not found: ID does not exist" Apr 17 08:06:32.755781 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.755768 2570 scope.go:117] "RemoveContainer" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.756004 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.755983 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does not exist" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.756087 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756016 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} err="failed to get container status \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does 
not exist" Apr 17 08:06:32.756087 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756041 2570 scope.go:117] "RemoveContainer" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.756294 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.756278 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.756341 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756298 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} err="failed to get container status \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": rpc error: code = NotFound desc = could not find container \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" Apr 17 08:06:32.756341 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756311 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.756517 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.756501 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 
08:06:32.756558 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756520 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} err="failed to get container status \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" Apr 17 08:06:32.756558 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756534 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.756716 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.756700 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.756755 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756720 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} err="failed to get container status \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": rpc error: code = NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" Apr 17 08:06:32.756755 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756733 2570 scope.go:117] 
"RemoveContainer" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.756908 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.756894 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.756959 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756911 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} err="failed to get container status \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" Apr 17 08:06:32.756959 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.756922 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.757120 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:06:32.757104 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.757159 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757125 2570 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} err="failed to get container status \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": rpc error: code = NotFound desc = could not find container \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" Apr 17 08:06:32.757159 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757139 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.757351 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757332 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} err="failed to get container status \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f not found: ID does not exist" Apr 17 08:06:32.757395 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757351 2570 scope.go:117] "RemoveContainer" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.757547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757525 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} err="failed to get container status \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with 
ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does not exist" Apr 17 08:06:32.757547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757545 2570 scope.go:117] "RemoveContainer" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.757746 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757727 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} err="failed to get container status \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": rpc error: code = NotFound desc = could not find container \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" Apr 17 08:06:32.757788 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757748 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.757979 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757963 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} err="failed to get container status \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" Apr 17 08:06:32.758020 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.757979 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.758221 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:06:32.758202 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} err="failed to get container status \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": rpc error: code = NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" Apr 17 08:06:32.758273 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758222 2570 scope.go:117] "RemoveContainer" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.758452 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758429 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} err="failed to get container status \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" Apr 17 08:06:32.758490 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758455 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.758683 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758663 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} err="failed to get container status \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": rpc error: code = NotFound desc = could not find container 
\"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" Apr 17 08:06:32.758721 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758684 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.758894 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758876 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} err="failed to get container status \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f not found: ID does not exist" Apr 17 08:06:32.758955 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.758895 2570 scope.go:117] "RemoveContainer" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.759095 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759078 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} err="failed to get container status \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does not exist" Apr 17 08:06:32.759135 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759096 2570 scope.go:117] "RemoveContainer" 
containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.759299 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759285 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} err="failed to get container status \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": rpc error: code = NotFound desc = could not find container \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" Apr 17 08:06:32.759299 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759298 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.759483 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759466 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} err="failed to get container status \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" Apr 17 08:06:32.759529 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759483 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.759663 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759649 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} err="failed to get container status 
\"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": rpc error: code = NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" Apr 17 08:06:32.759715 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759663 2570 scope.go:117] "RemoveContainer" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.759829 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759813 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} err="failed to get container status \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" Apr 17 08:06:32.759889 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.759830 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.760065 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760047 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} err="failed to get container status \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": rpc error: code = NotFound desc = could not find container \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" Apr 17 08:06:32.760125 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:32.760066 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.760288 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760270 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} err="failed to get container status \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f not found: ID does not exist" Apr 17 08:06:32.760326 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760289 2570 scope.go:117] "RemoveContainer" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.760506 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760485 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} err="failed to get container status \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does not exist" Apr 17 08:06:32.760565 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760509 2570 scope.go:117] "RemoveContainer" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.760703 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760687 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} err="failed to get container status \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": rpc error: code = NotFound desc = could not find container \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" Apr 17 08:06:32.760745 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760704 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.760926 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760907 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} err="failed to get container status \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" Apr 17 08:06:32.761004 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.760928 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.761138 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761123 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} err="failed to get container status \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": rpc error: code = NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 
6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" Apr 17 08:06:32.761175 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761138 2570 scope.go:117] "RemoveContainer" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.761326 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761311 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} err="failed to get container status \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" Apr 17 08:06:32.761364 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761327 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.761577 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761552 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} err="failed to get container status \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": rpc error: code = NotFound desc = could not find container \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" Apr 17 08:06:32.761634 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761578 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.761911 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761889 2570 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} err="failed to get container status \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f not found: ID does not exist" Apr 17 08:06:32.761911 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.761910 2570 scope.go:117] "RemoveContainer" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.762392 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.762364 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} err="failed to get container status \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does not exist" Apr 17 08:06:32.762470 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.762392 2570 scope.go:117] "RemoveContainer" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.762658 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.762631 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} err="failed to get container status \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": rpc error: code = NotFound desc = could not find container 
\"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" Apr 17 08:06:32.762727 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.762662 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.762874 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.762854 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} err="failed to get container status \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" Apr 17 08:06:32.762931 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.762875 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.763132 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763114 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} err="failed to get container status \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": rpc error: code = NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" Apr 17 08:06:32.763183 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763132 2570 scope.go:117] "RemoveContainer" 
containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.763358 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763338 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} err="failed to get container status \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" Apr 17 08:06:32.763404 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763359 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.763595 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763576 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} err="failed to get container status \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": rpc error: code = NotFound desc = could not find container \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" Apr 17 08:06:32.763666 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763598 2570 scope.go:117] "RemoveContainer" containerID="13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f" Apr 17 08:06:32.763797 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763779 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:06:32.763854 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763796 2570 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f"} err="failed to get container status \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": rpc error: code = NotFound desc = could not find container \"13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f\": container with ID starting with 13fbe2c6a3fb15ddc8e16d09c098c1a379419671a680de9ace86ee8f8c91cf9f not found: ID does not exist" Apr 17 08:06:32.763854 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.763816 2570 scope.go:117] "RemoveContainer" containerID="bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66" Apr 17 08:06:32.764037 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764014 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66"} err="failed to get container status \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": rpc error: code = NotFound desc = could not find container \"bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66\": container with ID starting with bb5ae6f7a0d94fc5262547d63d070d96ee680cfcb4fb5e2ebfd4923d75bafa66 not found: ID does not exist" Apr 17 08:06:32.764118 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764039 2570 scope.go:117] "RemoveContainer" containerID="35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a" Apr 17 08:06:32.764222 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764206 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-web" Apr 17 08:06:32.764297 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764223 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-web" Apr 17 08:06:32.764297 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:32.764239 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy" Apr 17 08:06:32.764297 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764246 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy" Apr 17 08:06:32.764297 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764247 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a"} err="failed to get container status \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": rpc error: code = NotFound desc = could not find container \"35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a\": container with ID starting with 35ec47ab811a7bac290829272c6b2187296d1095c117b84c03a9664571bf0a1a not found: ID does not exist" Apr 17 08:06:32.764297 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764264 2570 scope.go:117] "RemoveContainer" containerID="ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764256 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="thanos-sidecar" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764330 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="thanos-sidecar" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764345 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-thanos" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764353 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-thanos" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764362 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="config-reloader" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764367 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="config-reloader" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764390 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="init-config-reloader" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764398 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="init-config-reloader" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764407 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="prometheus" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764414 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="prometheus" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764490 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="thanos-sidecar" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764502 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-thanos" Apr 17 08:06:32.764504 ip-10-0-138-63 kubenswrapper[2570]: I0417 
08:06:32.764510 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="config-reloader" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764517 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="prometheus" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764510 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6"} err="failed to get container status \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": rpc error: code = NotFound desc = could not find container \"ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6\": container with ID starting with ce55b491e4a53d4765a00bdc76d228324ae39b3f75bc71088ac2225724c491c6 not found: ID does not exist" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764535 2570 scope.go:117] "RemoveContainer" containerID="6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764524 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy-web" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764581 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" containerName="kube-rbac-proxy" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764745 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241"} err="failed to get container status \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": rpc error: code = 
NotFound desc = could not find container \"6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241\": container with ID starting with 6479f2222aabc1255e7c94c6297fb9c7fc3feea5d13eb3b7919e6b83a9bf6241 not found: ID does not exist" Apr 17 08:06:32.764877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764763 2570 scope.go:117] "RemoveContainer" containerID="3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7" Apr 17 08:06:32.765160 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.764997 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7"} err="failed to get container status \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": rpc error: code = NotFound desc = could not find container \"3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7\": container with ID starting with 3a623c2715979d9d4afa868f9874ad1d2f4a28b72b097f3e63fad5fafbca4fd7 not found: ID does not exist" Apr 17 08:06:32.765160 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.765013 2570 scope.go:117] "RemoveContainer" containerID="55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009" Apr 17 08:06:32.765228 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.765201 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009"} err="failed to get container status \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": rpc error: code = NotFound desc = could not find container \"55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009\": container with ID starting with 55ae34fd01c8abb037557f1c45b0790eba63772fd225f1b7dd54484b2126a009 not found: ID does not exist" Apr 17 08:06:32.768333 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.768315 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.771418 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771399 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 08:06:32.771514 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771402 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 08:06:32.771514 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771432 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 08:06:32.771615 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771566 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 08:06:32.771666 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771625 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 08:06:32.771833 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771819 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 08:06:32.772005 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.771990 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-enrldquh127v0\"" Apr 17 08:06:32.772187 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.772166 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 08:06:32.772303 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.772288 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 08:06:32.772608 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.772594 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-nhlc9\"" Apr 17 08:06:32.772709 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.772694 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 08:06:32.772963 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.772927 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 08:06:32.775514 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.775484 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 08:06:32.776743 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.776723 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 08:06:32.780536 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.780513 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:06:32.916157 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916082 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916157 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916113 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916157 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916143 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916177 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916246 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916264 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916313 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916369 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/de08b5a8-2059-493a-9593-011e696f3a52-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916395 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-config\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de08b5a8-2059-493a-9593-011e696f3a52-config-out\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916488 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916541 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz8cj\" (UniqueName: \"kubernetes.io/projected/de08b5a8-2059-493a-9593-011e696f3a52-kube-api-access-wz8cj\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:32.916800 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:32.916597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-web-config\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.017218 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017190 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.017218 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
08:06:33.017381 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017238 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.017381 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.017451 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.017482 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.017524 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017651 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017688 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de08b5a8-2059-493a-9593-011e696f3a52-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017752 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-config\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017788 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017823 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de08b5a8-2059-493a-9593-011e696f3a52-config-out\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.017912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wz8cj\" (UniqueName: \"kubernetes.io/projected/de08b5a8-2059-493a-9593-011e696f3a52-kube-api-access-wz8cj\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.018008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.018042 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-web-config\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.018058 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.018068 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.018255 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.020283 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.018530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021117 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.020508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021117 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.020798 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021117 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.020796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021117 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:33.020855 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/de08b5a8-2059-493a-9593-011e696f3a52-config-out\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021117 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.021078 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-config\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021405 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.021250 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021460 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.021434 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/de08b5a8-2059-493a-9593-011e696f3a52-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.021848 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.021820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/de08b5a8-2059-493a-9593-011e696f3a52-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.023062 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.023036 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-web-config\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.023215 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.023199 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.023317 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.023302 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.023506 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.023486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/de08b5a8-2059-493a-9593-011e696f3a52-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.025909 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.025888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz8cj\" (UniqueName: \"kubernetes.io/projected/de08b5a8-2059-493a-9593-011e696f3a52-kube-api-access-wz8cj\") pod \"prometheus-k8s-0\" (UID: \"de08b5a8-2059-493a-9593-011e696f3a52\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.077826 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:33.077799 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:33.200573 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.200542 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 08:06:33.202922 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:06:33.202891 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde08b5a8_2059_493a_9593_011e696f3a52.slice/crio-58cb4f69e9addbda4157a2dd4bfdc674d15f2b1f7116be2a7ef3243eae46ca8e WatchSource:0}: Error finding container 58cb4f69e9addbda4157a2dd4bfdc674d15f2b1f7116be2a7ef3243eae46ca8e: Status 404 returned error can't find the container with id 58cb4f69e9addbda4157a2dd4bfdc674d15f2b1f7116be2a7ef3243eae46ca8e Apr 17 08:06:33.712333 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.712299 2570 generic.go:358] "Generic (PLEG): container finished" podID="de08b5a8-2059-493a-9593-011e696f3a52" containerID="667eedc3153822180abbe273c1c713dafddb11fc2cd86d3f9aa961bbf548b759" exitCode=0 Apr 17 08:06:33.712756 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.712351 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerDied","Data":"667eedc3153822180abbe273c1c713dafddb11fc2cd86d3f9aa961bbf548b759"} Apr 17 08:06:33.712756 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.712395 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"58cb4f69e9addbda4157a2dd4bfdc674d15f2b1f7116be2a7ef3243eae46ca8e"} Apr 17 08:06:33.888104 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:33.888074 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0fe2b7a6-23b5-4b49-8e94-86e6562f49bc" path="/var/lib/kubelet/pods/0fe2b7a6-23b5-4b49-8e94-86e6562f49bc/volumes" Apr 17 08:06:34.719176 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.719137 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"7fbfc2e484ef4dafca07cf7637c7d48aa000f841ec8153c3dc7bc9d2733d95be"} Apr 17 08:06:34.719176 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.719178 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"ba15acea13011d2d2da7a52226f424c9fcb6e364cc9313d33b46de0cac87c49c"} Apr 17 08:06:34.719596 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.719190 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"7c9260a8400870e9ffc543abe5701d91f35f25021e939f6b75acc29295208ea7"} Apr 17 08:06:34.719596 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.719199 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"78b4f14bba597f8096f2d159e35da00eed9a728cdee274d682d347012c8b1296"} Apr 17 08:06:34.719596 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.719207 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"1ae8b8ef52c7f7db2c748ff6155b6bfc06ce40ede077b6f00b882e618f7be007"} Apr 17 08:06:34.719596 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.719215 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"de08b5a8-2059-493a-9593-011e696f3a52","Type":"ContainerStarted","Data":"11e808d21d29fea24ace756440d36fcd3e8dad360e41d550cb63d97c7747b570"} Apr 17 08:06:34.749427 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:34.749361 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.749344323 podStartE2EDuration="2.749344323s" podCreationTimestamp="2026-04-17 08:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:06:34.747434062 +0000 UTC m=+235.504911516" watchObservedRunningTime="2026-04-17 08:06:34.749344323 +0000 UTC m=+235.506821754" Apr 17 08:06:38.078219 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:38.078188 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:06:50.678398 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:50.678360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:06:50.680680 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:50.680654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/825bc295-b53d-4e6b-9c7e-ad30d2d38c65-metrics-certs\") pod \"network-metrics-daemon-x6js9\" (UID: \"825bc295-b53d-4e6b-9c7e-ad30d2d38c65\") " pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:06:50.686598 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:50.686573 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x2n8l\"" Apr 17 08:06:50.693994 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:50.693974 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x6js9" Apr 17 08:06:50.814695 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:50.814672 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x6js9"] Apr 17 08:06:50.817286 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:06:50.817261 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825bc295_b53d_4e6b_9c7e_ad30d2d38c65.slice/crio-aa5436e1ed04f60e4989ef35d6c691392541188566892c3a3d28e23a79e66952 WatchSource:0}: Error finding container aa5436e1ed04f60e4989ef35d6c691392541188566892c3a3d28e23a79e66952: Status 404 returned error can't find the container with id aa5436e1ed04f60e4989ef35d6c691392541188566892c3a3d28e23a79e66952 Apr 17 08:06:51.776505 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:51.776463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6js9" event={"ID":"825bc295-b53d-4e6b-9c7e-ad30d2d38c65","Type":"ContainerStarted","Data":"aa5436e1ed04f60e4989ef35d6c691392541188566892c3a3d28e23a79e66952"} Apr 17 08:06:52.782048 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:52.782012 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6js9" event={"ID":"825bc295-b53d-4e6b-9c7e-ad30d2d38c65","Type":"ContainerStarted","Data":"6f01c0e052ba5ec07d581baa34d6a8a4e15a168041ab9dff03529c4d9c3fbc89"} Apr 17 08:06:52.782048 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:06:52.782052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x6js9" event={"ID":"825bc295-b53d-4e6b-9c7e-ad30d2d38c65","Type":"ContainerStarted","Data":"bc459775d533d0b13e05c20c866eee0bb9fe0030e6f7a1f1c25214599973af71"} Apr 17 08:06:52.801147 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:06:52.801099 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x6js9" podStartSLOduration=251.853690518 podStartE2EDuration="4m12.801086191s" podCreationTimestamp="2026-04-17 08:02:40 +0000 UTC" firstStartedPulling="2026-04-17 08:06:50.819586526 +0000 UTC m=+251.577063932" lastFinishedPulling="2026-04-17 08:06:51.766982185 +0000 UTC m=+252.524459605" observedRunningTime="2026-04-17 08:06:52.799312127 +0000 UTC m=+253.556789555" watchObservedRunningTime="2026-04-17 08:06:52.801086191 +0000 UTC m=+253.558563619" Apr 17 08:07:33.078254 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:07:33.078215 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:07:33.093776 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:07:33.093747 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:07:33.928418 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:07:33.928385 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 08:07:39.798853 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:07:39.798822 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log" Apr 17 08:07:39.798853 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:07:39.798840 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log" Apr 17 08:07:39.801828 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:07:39.801809 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 08:08:09.165350 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.165313 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-8966b78d4-9mg42"] Apr 17 08:08:09.168985 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.168969 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.171656 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.171634 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 08:08:09.171795 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.171690 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 08:08:09.173028 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.173010 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-ghnqm\"" Apr 17 08:08:09.177021 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.176999 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9mg42"] Apr 17 08:08:09.245772 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.245744 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da703fd-8ded-45d3-bc2a-31634725a719-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9mg42\" (UID: \"7da703fd-8ded-45d3-bc2a-31634725a719\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.245908 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.245785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4bf\" (UniqueName: \"kubernetes.io/projected/7da703fd-8ded-45d3-bc2a-31634725a719-kube-api-access-nl4bf\") pod \"cert-manager-cainjector-8966b78d4-9mg42\" (UID: \"7da703fd-8ded-45d3-bc2a-31634725a719\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.346267 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.346237 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da703fd-8ded-45d3-bc2a-31634725a719-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9mg42\" (UID: \"7da703fd-8ded-45d3-bc2a-31634725a719\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.346413 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.346278 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4bf\" (UniqueName: \"kubernetes.io/projected/7da703fd-8ded-45d3-bc2a-31634725a719-kube-api-access-nl4bf\") pod \"cert-manager-cainjector-8966b78d4-9mg42\" (UID: \"7da703fd-8ded-45d3-bc2a-31634725a719\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.356600 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.356566 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7da703fd-8ded-45d3-bc2a-31634725a719-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-9mg42\" (UID: \"7da703fd-8ded-45d3-bc2a-31634725a719\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.356600 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.356589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4bf\" (UniqueName: \"kubernetes.io/projected/7da703fd-8ded-45d3-bc2a-31634725a719-kube-api-access-nl4bf\") pod \"cert-manager-cainjector-8966b78d4-9mg42\" (UID: \"7da703fd-8ded-45d3-bc2a-31634725a719\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.492776 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.492704 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" Apr 17 08:08:09.609544 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.609493 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-9mg42"] Apr 17 08:08:09.611868 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:08:09.611842 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7da703fd_8ded_45d3_bc2a_31634725a719.slice/crio-9e734df32c046a3aa47c9e9693468a5d6134845b9c6badf073fd3f2d4607cc31 WatchSource:0}: Error finding container 9e734df32c046a3aa47c9e9693468a5d6134845b9c6badf073fd3f2d4607cc31: Status 404 returned error can't find the container with id 9e734df32c046a3aa47c9e9693468a5d6134845b9c6badf073fd3f2d4607cc31 Apr 17 08:08:09.613603 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:09.613583 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 08:08:10.021186 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:10.021147 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" event={"ID":"7da703fd-8ded-45d3-bc2a-31634725a719","Type":"ContainerStarted","Data":"9e734df32c046a3aa47c9e9693468a5d6134845b9c6badf073fd3f2d4607cc31"} Apr 17 08:08:13.033823 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:13.033785 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" event={"ID":"7da703fd-8ded-45d3-bc2a-31634725a719","Type":"ContainerStarted","Data":"e48c351bd5b52bb9b493bb2716c44b5327bdd7d27641ef257762ae18cdf6c448"} Apr 17 08:08:13.051004 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:08:13.050949 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-9mg42" podStartSLOduration=0.845616135 podStartE2EDuration="4.050921019s" 
podCreationTimestamp="2026-04-17 08:08:09 +0000 UTC" firstStartedPulling="2026-04-17 08:08:09.613775908 +0000 UTC m=+330.371253330" lastFinishedPulling="2026-04-17 08:08:12.819080797 +0000 UTC m=+333.576558214" observedRunningTime="2026-04-17 08:08:13.048713404 +0000 UTC m=+333.806190842" watchObservedRunningTime="2026-04-17 08:08:13.050921019 +0000 UTC m=+333.808398447" Apr 17 08:09:21.834665 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.834627 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-njcq8/must-gather-6445b"] Apr 17 08:09:21.837016 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.837000 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:21.839758 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.839734 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-njcq8\"/\"kube-root-ca.crt\"" Apr 17 08:09:21.839877 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.839766 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-njcq8\"/\"openshift-service-ca.crt\"" Apr 17 08:09:21.841061 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.841044 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-njcq8\"/\"default-dockercfg-z5sbb\"" Apr 17 08:09:21.844169 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.844150 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-njcq8/must-gather-6445b"] Apr 17 08:09:21.947699 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.947660 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9b7p\" (UniqueName: \"kubernetes.io/projected/ab2a338a-d163-4d8f-af47-7a635300fe04-kube-api-access-g9b7p\") pod \"must-gather-6445b\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") 
" pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:21.947859 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:21.947713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab2a338a-d163-4d8f-af47-7a635300fe04-must-gather-output\") pod \"must-gather-6445b\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:22.048763 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:22.048731 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9b7p\" (UniqueName: \"kubernetes.io/projected/ab2a338a-d163-4d8f-af47-7a635300fe04-kube-api-access-g9b7p\") pod \"must-gather-6445b\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:22.048968 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:22.048772 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab2a338a-d163-4d8f-af47-7a635300fe04-must-gather-output\") pod \"must-gather-6445b\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:22.049113 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:22.049095 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab2a338a-d163-4d8f-af47-7a635300fe04-must-gather-output\") pod \"must-gather-6445b\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:22.056779 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:22.056751 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9b7p\" (UniqueName: \"kubernetes.io/projected/ab2a338a-d163-4d8f-af47-7a635300fe04-kube-api-access-g9b7p\") pod 
\"must-gather-6445b\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:22.146761 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:22.146644 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:09:22.267374 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:22.267351 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-njcq8/must-gather-6445b"] Apr 17 08:09:22.269072 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:09:22.269038 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab2a338a_d163_4d8f_af47_7a635300fe04.slice/crio-ad76057bb917fb59d3717046a958eea5a50ee18a0cb1c710afb8fac0934719f6 WatchSource:0}: Error finding container ad76057bb917fb59d3717046a958eea5a50ee18a0cb1c710afb8fac0934719f6: Status 404 returned error can't find the container with id ad76057bb917fb59d3717046a958eea5a50ee18a0cb1c710afb8fac0934719f6 Apr 17 08:09:23.245777 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:23.245740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-njcq8/must-gather-6445b" event={"ID":"ab2a338a-d163-4d8f-af47-7a635300fe04","Type":"ContainerStarted","Data":"ad76057bb917fb59d3717046a958eea5a50ee18a0cb1c710afb8fac0934719f6"} Apr 17 08:09:28.264561 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:28.264524 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-njcq8/must-gather-6445b" event={"ID":"ab2a338a-d163-4d8f-af47-7a635300fe04","Type":"ContainerStarted","Data":"ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94"} Apr 17 08:09:28.265168 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:28.264567 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-njcq8/must-gather-6445b" 
event={"ID":"ab2a338a-d163-4d8f-af47-7a635300fe04","Type":"ContainerStarted","Data":"014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9"} Apr 17 08:09:28.281171 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:09:28.281117 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-njcq8/must-gather-6445b" podStartSLOduration=2.163879696 podStartE2EDuration="7.281102103s" podCreationTimestamp="2026-04-17 08:09:21 +0000 UTC" firstStartedPulling="2026-04-17 08:09:22.27091238 +0000 UTC m=+403.028389786" lastFinishedPulling="2026-04-17 08:09:27.388134783 +0000 UTC m=+408.145612193" observedRunningTime="2026-04-17 08:09:28.279411234 +0000 UTC m=+409.036888663" watchObservedRunningTime="2026-04-17 08:09:28.281102103 +0000 UTC m=+409.038579530" Apr 17 08:10:12.420131 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:12.420095 2570 generic.go:358] "Generic (PLEG): container finished" podID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerID="014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9" exitCode=0 Apr 17 08:10:12.420551 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:12.420166 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-njcq8/must-gather-6445b" event={"ID":"ab2a338a-d163-4d8f-af47-7a635300fe04","Type":"ContainerDied","Data":"014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9"} Apr 17 08:10:12.420551 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:12.420485 2570 scope.go:117] "RemoveContainer" containerID="014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9" Apr 17 08:10:13.354984 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:13.354950 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-njcq8_must-gather-6445b_ab2a338a-d163-4d8f-af47-7a635300fe04/gather/0.log" Apr 17 08:10:16.528387 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:16.528358 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-6cfjr_12c8f408-58f4-4cc4-a90f-967f072165d2/global-pull-secret-syncer/0.log" Apr 17 08:10:16.648528 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:16.648491 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7jkx8_ba1fb94a-02fe-4c28-9c64-63dbb3a0662a/konnectivity-agent/0.log" Apr 17 08:10:16.769003 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:16.768971 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-63.ec2.internal_19617c61026894db47a634e0cbb16491/haproxy/0.log" Apr 17 08:10:18.678174 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.678114 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-njcq8/must-gather-6445b"] Apr 17 08:10:18.678646 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.678419 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-njcq8/must-gather-6445b" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="copy" containerID="cri-o://ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94" gracePeriod=2 Apr 17 08:10:18.680516 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.680493 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-njcq8/must-gather-6445b"] Apr 17 08:10:18.681026 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.681003 2570 status_manager.go:895] "Failed to get status for pod" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" pod="openshift-must-gather-njcq8/must-gather-6445b" err="pods \"must-gather-6445b\" is forbidden: User \"system:node:ip-10-0-138-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-njcq8\": no relationship found between node 'ip-10-0-138-63.ec2.internal' and this object" Apr 17 08:10:18.912209 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.912185 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-njcq8_must-gather-6445b_ab2a338a-d163-4d8f-af47-7a635300fe04/copy/0.log" Apr 17 08:10:18.912547 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.912531 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:10:18.914729 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.914700 2570 status_manager.go:895] "Failed to get status for pod" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" pod="openshift-must-gather-njcq8/must-gather-6445b" err="pods \"must-gather-6445b\" is forbidden: User \"system:node:ip-10-0-138-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-njcq8\": no relationship found between node 'ip-10-0-138-63.ec2.internal' and this object" Apr 17 08:10:18.972311 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.972248 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab2a338a-d163-4d8f-af47-7a635300fe04-must-gather-output\") pod \"ab2a338a-d163-4d8f-af47-7a635300fe04\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " Apr 17 08:10:18.972428 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.972327 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9b7p\" (UniqueName: \"kubernetes.io/projected/ab2a338a-d163-4d8f-af47-7a635300fe04-kube-api-access-g9b7p\") pod \"ab2a338a-d163-4d8f-af47-7a635300fe04\" (UID: \"ab2a338a-d163-4d8f-af47-7a635300fe04\") " Apr 17 08:10:18.974344 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.974313 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab2a338a-d163-4d8f-af47-7a635300fe04-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ab2a338a-d163-4d8f-af47-7a635300fe04" (UID: 
"ab2a338a-d163-4d8f-af47-7a635300fe04"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 08:10:18.974433 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:18.974342 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2a338a-d163-4d8f-af47-7a635300fe04-kube-api-access-g9b7p" (OuterVolumeSpecName: "kube-api-access-g9b7p") pod "ab2a338a-d163-4d8f-af47-7a635300fe04" (UID: "ab2a338a-d163-4d8f-af47-7a635300fe04"). InnerVolumeSpecName "kube-api-access-g9b7p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 08:10:19.073826 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.073798 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ab2a338a-d163-4d8f-af47-7a635300fe04-must-gather-output\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:10:19.073826 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.073825 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9b7p\" (UniqueName: \"kubernetes.io/projected/ab2a338a-d163-4d8f-af47-7a635300fe04-kube-api-access-g9b7p\") on node \"ip-10-0-138-63.ec2.internal\" DevicePath \"\"" Apr 17 08:10:19.443370 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.443343 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-njcq8_must-gather-6445b_ab2a338a-d163-4d8f-af47-7a635300fe04/copy/0.log" Apr 17 08:10:19.443659 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.443636 2570 generic.go:358] "Generic (PLEG): container finished" podID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerID="ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94" exitCode=143 Apr 17 08:10:19.443746 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.443681 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-njcq8/must-gather-6445b" Apr 17 08:10:19.443746 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.443731 2570 scope.go:117] "RemoveContainer" containerID="ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94" Apr 17 08:10:19.446166 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.446135 2570 status_manager.go:895] "Failed to get status for pod" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" pod="openshift-must-gather-njcq8/must-gather-6445b" err="pods \"must-gather-6445b\" is forbidden: User \"system:node:ip-10-0-138-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-njcq8\": no relationship found between node 'ip-10-0-138-63.ec2.internal' and this object" Apr 17 08:10:19.451978 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.451958 2570 scope.go:117] "RemoveContainer" containerID="014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9" Apr 17 08:10:19.454204 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.454179 2570 status_manager.go:895] "Failed to get status for pod" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" pod="openshift-must-gather-njcq8/must-gather-6445b" err="pods \"must-gather-6445b\" is forbidden: User \"system:node:ip-10-0-138-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-njcq8\": no relationship found between node 'ip-10-0-138-63.ec2.internal' and this object" Apr 17 08:10:19.463527 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.463503 2570 scope.go:117] "RemoveContainer" containerID="ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94" Apr 17 08:10:19.463775 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:10:19.463754 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94\": container with ID starting 
with ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94 not found: ID does not exist" containerID="ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94" Apr 17 08:10:19.463839 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.463786 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94"} err="failed to get container status \"ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94\": rpc error: code = NotFound desc = could not find container \"ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94\": container with ID starting with ecdf89439ad461a336a837c0844a72bd17463fd9e0ff2454f6cc01939ed76e94 not found: ID does not exist" Apr 17 08:10:19.463839 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.463811 2570 scope.go:117] "RemoveContainer" containerID="014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9" Apr 17 08:10:19.464067 ip-10-0-138-63 kubenswrapper[2570]: E0417 08:10:19.464051 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9\": container with ID starting with 014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9 not found: ID does not exist" containerID="014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9" Apr 17 08:10:19.464113 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.464072 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9"} err="failed to get container status \"014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9\": rpc error: code = NotFound desc = could not find container \"014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9\": container with ID starting with 
014baa1c419581feae1da5cbf13f0062ee2d627437bdbc8f34e83c2980ee30c9 not found: ID does not exist" Apr 17 08:10:19.655851 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.655824 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/alertmanager/0.log" Apr 17 08:10:19.693126 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.693097 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/config-reloader/0.log" Apr 17 08:10:19.719742 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.719679 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/kube-rbac-proxy-web/0.log" Apr 17 08:10:19.750920 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.750896 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/kube-rbac-proxy/0.log" Apr 17 08:10:19.782268 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.782244 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/kube-rbac-proxy-metric/0.log" Apr 17 08:10:19.812376 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.812351 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/prom-label-proxy/0.log" Apr 17 08:10:19.848033 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.848010 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c30c34aa-650e-46d7-9669-07dd9872db26/init-config-reloader/0.log" Apr 17 08:10:19.887400 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.887366 2570 status_manager.go:895] "Failed to get status for pod" 
podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" pod="openshift-must-gather-njcq8/must-gather-6445b" err="pods \"must-gather-6445b\" is forbidden: User \"system:node:ip-10-0-138-63.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-njcq8\": no relationship found between node 'ip-10-0-138-63.ec2.internal' and this object" Apr 17 08:10:19.888185 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.888163 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" path="/var/lib/kubelet/pods/ab2a338a-d163-4d8f-af47-7a635300fe04/volumes" Apr 17 08:10:19.889157 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.889141 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-f85s5_6c639c32-2f50-4f3f-9ba1-c99215cb7e01/cluster-monitoring-operator/0.log" Apr 17 08:10:19.917810 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.917787 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-889nc_76d99ffc-5745-4855-aad4-1f77be2a9f22/kube-state-metrics/0.log" Apr 17 08:10:19.941895 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.941849 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-889nc_76d99ffc-5745-4855-aad4-1f77be2a9f22/kube-rbac-proxy-main/0.log" Apr 17 08:10:19.965088 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:19.965059 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-889nc_76d99ffc-5745-4855-aad4-1f77be2a9f22/kube-rbac-proxy-self/0.log" Apr 17 08:10:20.003295 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.003237 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-55b96554df-2z2jz_b52aae27-b89c-4591-ae6b-d324992aef0c/metrics-server/0.log" Apr 17 
08:10:20.218842 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.218792 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgs9j_9ca23c5d-aff9-4250-952e-3fe91b19a469/node-exporter/0.log" Apr 17 08:10:20.241013 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.240959 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgs9j_9ca23c5d-aff9-4250-952e-3fe91b19a469/kube-rbac-proxy/0.log" Apr 17 08:10:20.262038 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.262015 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zgs9j_9ca23c5d-aff9-4250-952e-3fe91b19a469/init-textfile/0.log" Apr 17 08:10:20.290314 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.290287 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gmhvh_c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1/kube-rbac-proxy-main/0.log" Apr 17 08:10:20.313245 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.313222 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gmhvh_c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1/kube-rbac-proxy-self/0.log" Apr 17 08:10:20.336282 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.336255 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gmhvh_c278dd7a-99b8-425a-b7a7-f5cdcd2e0ce1/openshift-state-metrics/0.log" Apr 17 08:10:20.378053 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.378025 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/prometheus/0.log" Apr 17 08:10:20.402211 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.402173 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/config-reloader/0.log" Apr 17 08:10:20.427446 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.427422 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/thanos-sidecar/0.log" Apr 17 08:10:20.450326 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.450304 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/kube-rbac-proxy-web/0.log" Apr 17 08:10:20.471802 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.471776 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/kube-rbac-proxy/0.log" Apr 17 08:10:20.497107 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.497084 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/kube-rbac-proxy-thanos/0.log" Apr 17 08:10:20.518294 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.518234 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_de08b5a8-2059-493a-9593-011e696f3a52/init-config-reloader/0.log" Apr 17 08:10:20.592553 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.592525 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-rn7l6_aa7fa9f7-d17e-4534-91e2-b7f035da1e65/prometheus-operator-admission-webhook/0.log" Apr 17 08:10:20.625393 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.625369 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9849b9fd7-vhprn_fe2a37d6-7621-4fe0-933f-ddad7873f146/telemeter-client/0.log" Apr 17 08:10:20.647351 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.647328 2570 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9849b9fd7-vhprn_fe2a37d6-7621-4fe0-933f-ddad7873f146/reload/0.log" Apr 17 08:10:20.674024 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:20.674001 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9849b9fd7-vhprn_fe2a37d6-7621-4fe0-933f-ddad7873f146/kube-rbac-proxy/0.log" Apr 17 08:10:21.978643 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:21.978562 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zjl9j_fee27ff2-c1bc-4b4a-ab6a-a22844376a8f/networking-console-plugin/0.log" Apr 17 08:10:23.122566 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.122537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-vwc9p_ba78d13c-1ebd-4761-b661-3cc6591106b7/volume-data-source-validator/0.log" Apr 17 08:10:23.456527 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456447 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7"] Apr 17 08:10:23.456802 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456789 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="copy" Apr 17 08:10:23.456843 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456804 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="copy" Apr 17 08:10:23.456843 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456817 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="gather" Apr 17 08:10:23.456843 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456823 2570 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="gather" Apr 17 08:10:23.456954 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456884 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="gather" Apr 17 08:10:23.456954 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.456897 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab2a338a-d163-4d8f-af47-7a635300fe04" containerName="copy" Apr 17 08:10:23.459506 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.459487 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.462174 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.462150 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-btznw\"/\"kube-root-ca.crt\"" Apr 17 08:10:23.463598 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.463574 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-btznw\"/\"openshift-service-ca.crt\"" Apr 17 08:10:23.463799 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.463596 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-btznw\"/\"default-dockercfg-z96wr\"" Apr 17 08:10:23.466237 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.466149 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7"] Apr 17 08:10:23.513434 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.513408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-sys\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " 
pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.513583 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.513503 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-lib-modules\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.513583 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.513564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbvr\" (UniqueName: \"kubernetes.io/projected/06f92b80-29ca-4b28-b951-7572a66eba75-kube-api-access-8mbvr\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.513704 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.513594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-podres\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.513704 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.513640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-proc\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.614732 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-lib-modules\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.614882 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbvr\" (UniqueName: \"kubernetes.io/projected/06f92b80-29ca-4b28-b951-7572a66eba75-kube-api-access-8mbvr\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.614882 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-podres\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.614882 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-proc\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.614882 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-sys\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.614882 
ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-proc\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.615151 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-podres\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.615151 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-lib-modules\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.615151 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.614917 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06f92b80-29ca-4b28-b951-7572a66eba75-sys\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.623284 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.623264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbvr\" (UniqueName: \"kubernetes.io/projected/06f92b80-29ca-4b28-b951-7572a66eba75-kube-api-access-8mbvr\") pod \"perf-node-gather-daemonset-q7zj7\" (UID: \"06f92b80-29ca-4b28-b951-7572a66eba75\") " 
pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.762814 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.762782 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-252zk_421db932-74ef-4855-b174-a7ce6bca201b/dns/0.log" Apr 17 08:10:23.769513 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.769492 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:23.783744 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.783720 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-252zk_421db932-74ef-4855-b174-a7ce6bca201b/kube-rbac-proxy/0.log" Apr 17 08:10:23.887536 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.887492 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7"] Apr 17 08:10:23.893213 ip-10-0-138-63 kubenswrapper[2570]: W0417 08:10:23.893184 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod06f92b80_29ca_4b28_b951_7572a66eba75.slice/crio-0b85a57a72e58a1f1043f6e525c9ed2fed87fbafef144e71b28ffc447b9ec71c WatchSource:0}: Error finding container 0b85a57a72e58a1f1043f6e525c9ed2fed87fbafef144e71b28ffc447b9ec71c: Status 404 returned error can't find the container with id 0b85a57a72e58a1f1043f6e525c9ed2fed87fbafef144e71b28ffc447b9ec71c Apr 17 08:10:23.907560 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:23.907537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cqc8h_e63f1d9b-13b7-4099-ad63-64b33b70f697/dns-node-resolver/0.log" Apr 17 08:10:24.394260 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:24.394156 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-skwnw_2c3085fe-841e-4ff9-aa63-90a0b035c240/node-ca/0.log" Apr 17 08:10:24.461576 ip-10-0-138-63 
kubenswrapper[2570]: I0417 08:10:24.461543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" event={"ID":"06f92b80-29ca-4b28-b951-7572a66eba75","Type":"ContainerStarted","Data":"0ca9e678e5ab5f70361990118087cd773849ef05dc5c36b51ee0294338fdb9ab"} Apr 17 08:10:24.461576 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:24.461580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" event={"ID":"06f92b80-29ca-4b28-b951-7572a66eba75","Type":"ContainerStarted","Data":"0b85a57a72e58a1f1043f6e525c9ed2fed87fbafef144e71b28ffc447b9ec71c"} Apr 17 08:10:24.461782 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:24.461700 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:24.477890 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:24.477848 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" podStartSLOduration=1.477836094 podStartE2EDuration="1.477836094s" podCreationTimestamp="2026-04-17 08:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 08:10:24.47738848 +0000 UTC m=+465.234865908" watchObservedRunningTime="2026-04-17 08:10:24.477836094 +0000 UTC m=+465.235313522" Apr 17 08:10:25.403120 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:25.403089 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wp9g5_116a85c5-54d8-4462-9305-b1de37bca8cf/serve-healthcheck-canary/0.log" Apr 17 08:10:25.718464 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:25.718363 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-vwznj_29f0e625-d1d8-412e-8a62-f3d9c9c33c3e/insights-operator/0.log" Apr 17 08:10:25.718697 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:25.718647 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-vwznj_29f0e625-d1d8-412e-8a62-f3d9c9c33c3e/insights-operator/1.log" Apr 17 08:10:25.738156 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:25.738130 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4pltb_febe2ce9-02a6-467c-836e-72a352ffead8/kube-rbac-proxy/0.log" Apr 17 08:10:25.760105 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:25.760086 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4pltb_febe2ce9-02a6-467c-836e-72a352ffead8/exporter/0.log" Apr 17 08:10:25.781436 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:25.781393 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4pltb_febe2ce9-02a6-467c-836e-72a352ffead8/extractor/0.log" Apr 17 08:10:30.412754 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:30.412712 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-55t9m_48f0adf1-9c65-4866-b146-76db151d34d3/migrator/0.log" Apr 17 08:10:30.432815 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:30.432795 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-55t9m_48f0adf1-9c65-4866-b146-76db151d34d3/graceful-termination/0.log" Apr 17 08:10:30.474666 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:30.474635 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-btznw/perf-node-gather-daemonset-q7zj7" Apr 17 08:10:31.912704 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:31.912677 
2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/kube-multus-additional-cni-plugins/0.log" Apr 17 08:10:31.936195 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:31.936149 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/egress-router-binary-copy/0.log" Apr 17 08:10:31.957468 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:31.957445 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/cni-plugins/0.log" Apr 17 08:10:31.979029 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:31.979012 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/bond-cni-plugin/0.log" Apr 17 08:10:32.002638 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:32.002611 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/routeoverride-cni/0.log" Apr 17 08:10:32.024856 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:32.024830 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/whereabouts-cni-bincopy/0.log" Apr 17 08:10:32.047723 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:32.047692 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xc6x7_26d3cacc-3bac-4f3c-9676-7ce9a23d4ae1/whereabouts-cni/0.log" Apr 17 08:10:32.106495 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:32.106468 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-krs9k_6b843f47-d57d-4596-961a-205314dbf0f8/kube-multus/0.log" Apr 17 
08:10:32.262786 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:32.262750 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x6js9_825bc295-b53d-4e6b-9c7e-ad30d2d38c65/network-metrics-daemon/0.log" Apr 17 08:10:32.284774 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:32.284746 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x6js9_825bc295-b53d-4e6b-9c7e-ad30d2d38c65/kube-rbac-proxy/0.log" Apr 17 08:10:33.088086 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.088057 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-controller/0.log" Apr 17 08:10:33.107123 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.107100 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/0.log" Apr 17 08:10:33.109276 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.109254 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovn-acl-logging/1.log" Apr 17 08:10:33.128227 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.128206 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/kube-rbac-proxy-node/0.log" Apr 17 08:10:33.150432 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.150411 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 08:10:33.172838 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.172818 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/northd/0.log" Apr 17 08:10:33.201833 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.201813 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/nbdb/0.log" Apr 17 08:10:33.224588 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.224567 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/sbdb/0.log" Apr 17 08:10:33.321395 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:33.321368 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nt4p9_832735dc-0dda-465b-96fe-56bb39f2a72b/ovnkube-controller/0.log" Apr 17 08:10:34.945375 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:34.945346 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-rkbs5_36ae9c9b-1417-4e3d-8f1a-e54cbe63c9dd/network-check-target-container/0.log" Apr 17 08:10:35.780559 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:35.780525 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-sd8ch_aa9f1d02-c04d-4591-a2d3-aa61e92869ba/iptables-alerter/0.log" Apr 17 08:10:36.343581 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:36.343553 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2f72z_0e200b23-a648-49b4-9ee6-7b0e5ceaba16/tuned/0.log" Apr 17 08:10:37.938612 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:37.938581 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2z86d_daaeed21-ac53-4784-abb1-fc080fe469a9/cluster-samples-operator/0.log" Apr 17 08:10:37.965199 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:37.965171 2570 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-2z86d_daaeed21-ac53-4784-abb1-fc080fe469a9/cluster-samples-operator-watch/0.log" Apr 17 08:10:38.831114 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:38.831084 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-csrrb_c9d36857-0992-41ad-aa34-4e41c08ace48/service-ca-operator/1.log" Apr 17 08:10:38.831900 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:38.831880 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-csrrb_c9d36857-0992-41ad-aa34-4e41c08ace48/service-ca-operator/0.log" Apr 17 08:10:39.594949 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:39.594920 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9tl68_d5a576c4-fd46-48a0-9584-c6849f6fca38/csi-driver/0.log" Apr 17 08:10:39.616217 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:39.616194 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9tl68_d5a576c4-fd46-48a0-9584-c6849f6fca38/csi-node-driver-registrar/0.log" Apr 17 08:10:39.638543 ip-10-0-138-63 kubenswrapper[2570]: I0417 08:10:39.638521 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9tl68_d5a576c4-fd46-48a0-9584-c6849f6fca38/csi-liveness-probe/0.log"