Apr 22 15:33:50.415167 ip-10-0-143-30 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:33:50.848391 ip-10-0-143-30 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:33:50.848391 ip-10-0-143-30 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:33:50.848391 ip-10-0-143-30 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:33:50.848391 ip-10-0-143-30 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:33:50.848391 ip-10-0-143-30 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:33:50.850358 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.850289    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:33:50.855583 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855568    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:33:50.855583 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855584    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855588    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855591    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855594    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855597    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855600    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855603    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855605    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855608    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855611    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855613    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855616    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855618    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855621    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855624    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855626    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855629    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855632    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855634    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855637    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:33:50.855658 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855639    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855642    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855645    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855649    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855652    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855655    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855658    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855661    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855665    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855668    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855671    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855673    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855676    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855679    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855681    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855685    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855690    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855693    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855696    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:33:50.856139 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855699    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855701    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855704    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855706    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855709    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855711    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855714    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855716    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855718    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855721    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855724    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855726    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855729    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855731    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855734    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855737    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855740    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855743    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855745    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855748    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:33:50.856611 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855750    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855753    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855755    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855758    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855760    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855763    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855765    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855768    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855770    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855772    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855775    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855777    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855786    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855789    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855791    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855795    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855797    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855800    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855802    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:33:50.857158 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855805    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855807    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855809    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855812    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855815    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855818    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.855820    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856247    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856253    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856256    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856259    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856262    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856264    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856267    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856269    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856272    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856274    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856277    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856279    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:33:50.857604 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856282    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856285    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856287    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856289    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856292    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856294    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856302    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856305    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856308    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856310    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856313    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856316    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856319    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856321    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856324    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856327    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856331    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856335    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856338    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:33:50.858073 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856340    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856343    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856345    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856348    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856350    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856353    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856355    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856358    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856360    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856362    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856365    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856367    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856370    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856372    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856375    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856377    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856380    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856382    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856385    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:33:50.858551 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856387    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856395    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856397    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856400    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856402    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856405    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856407    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856410    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856412    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856415    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856417    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856420    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856422    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856425    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856427    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856430    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856432    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856434    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856437    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856440    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:33:50.859031 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856442    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856445    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856447    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856450    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856452    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856454    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856457    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856459    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856462    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856464    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856469    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856473    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856476    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856479    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856487    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.856490    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856571    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856578    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856592    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856596    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856601    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:33:50.859523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856604    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856608    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856612    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856615    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856618    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856622    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856625    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856629    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856631    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856634    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856637    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856639    2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856642    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856645    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856651    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856653    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856656    2577 flags.go:64] FLAG: --config-dir=""
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856659    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856662    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856666    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856669    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856675    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856678    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856681    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:33:50.860043 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856684    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856686    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856695    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856698    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856702    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856705    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856709    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856712    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856715    2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856718    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856725    2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856728    2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856730    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856733    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856736    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856740    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856742    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856745    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856748    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856751    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856754    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856757    2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856759    2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856762    2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856765    2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:33:50.860610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856768    2577 flags.go:64] FLAG: --feature-gates=""
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856771    2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856774    2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856777    2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856781    2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856784    2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856787    2577 flags.go:64] FLAG: --help="false"
Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422
15:33:50.856790 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-143-30.ec2.internal" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856793 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856795 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856803 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856807 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856810 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856814 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856817 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856819 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856822 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856825 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856828 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856831 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856833 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 
15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856836 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856839 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856842 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:33:50.861251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856845 2577 flags.go:64] FLAG: --lock-file="" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856847 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856850 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856853 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856858 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856861 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856863 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856866 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856869 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856872 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856875 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856877 2577 
flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856882 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856885 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856889 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856892 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856909 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856912 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856915 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856924 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856927 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856931 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856938 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856941 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856944 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:33:50.861863 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856947 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:33:50.862488 
ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856949 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856955 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856958 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856961 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856964 2577 flags.go:64] FLAG: --port="10250" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856967 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856970 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04dbfe6ead10dcc1c" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856973 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856975 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856979 2577 flags.go:64] FLAG: --register-node="true" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856981 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856984 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856988 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856990 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:33:50.856993 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856996 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.856999 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857002 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857005 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857008 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857011 2577 flags.go:64] FLAG: --runonce="false" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857014 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857017 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857020 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:33:50.862488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857022 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857025 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857034 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857038 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857041 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 
15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857044 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857046 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857049 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857052 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857055 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857058 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857061 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857066 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857069 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857071 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857078 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857081 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857083 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857086 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857088 2577 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857091 2577 flags.go:64] FLAG: --v="2" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857095 2577 flags.go:64] FLAG: --version="false" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857099 2577 flags.go:64] FLAG: --vmodule="" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857103 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.857106 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:33:50.863144 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857225 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857229 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857235 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857239 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857248 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857251 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857254 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857257 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857260 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857263 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857273 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857276 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857278 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857281 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857284 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857286 2577 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857288 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857291 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857294 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:33:50.863734 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857296 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857299 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857301 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857303 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857306 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857310 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857313 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857316 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857318 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857321 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857323 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857325 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857328 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857330 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857333 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857336 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857339 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857341 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857344 2577 feature_gate.go:328] 
unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857346 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:33:50.864207 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857349 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857351 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857354 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857357 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857365 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857368 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857370 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857373 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857375 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857377 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857380 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857382 2577 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857385 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857387 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857390 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857392 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857395 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857398 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857400 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857403 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:33:50.864710 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857405 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857407 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857410 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857412 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:33:50.865205 
ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857415 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857417 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857420 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857423 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857425 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857428 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857431 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857433 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857435 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857438 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857440 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857444 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857447 2577 feature_gate.go:328] unrecognized 
feature gate: VSphereMixedNodeEnv Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857454 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857457 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857459 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:33:50.865205 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857461 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857464 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857466 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857469 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857471 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857474 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.857476 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.858051 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.864068 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.864081 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864126 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864131 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864135 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864138 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864141 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864143 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:33:50.865715 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864146 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864149 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864151 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:33:50.866137 ip-10-0-143-30 
kubenswrapper[2577]: W0422 15:33:50.864154 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864157 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864160 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864163 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864165 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864168 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864170 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864173 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864176 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864179 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864182 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864184 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864187 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 
15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864189 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864192 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864194 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864197 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:33:50.866137 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864199 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864203 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864206 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864209 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864212 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864215 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864218 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864220 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864223 2577 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864225 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864228 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864230 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864233 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864236 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864238 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864241 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864244 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864247 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864249 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:33:50.866622 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864252 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864255 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:33:50.867089 
ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864257 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864259 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864262 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864264 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864267 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864269 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864272 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864274 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864276 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864279 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864281 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864284 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864286 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 
22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864288 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864291 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864293 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864296 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864299 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:33:50.867089 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864301 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864303 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864306 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864308 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864311 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864313 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864316 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864318 2577 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864321 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864323 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864326 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864328 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864332 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864337 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864340 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864343 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864345 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864348 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864351 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:33:50.867572 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864353 2577 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864356 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.864361 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864452 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864456 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864459 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864462 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864465 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864468 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864470 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864473 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS 
Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864475 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864478 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864481 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864484 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864486 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:33:50.868117 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864489 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864491 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864494 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864496 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864499 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864501 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864504 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864506 
2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864509 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864511 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864514 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864517 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864519 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864522 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864524 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864527 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864529 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864532 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864534 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:33:50.868504 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864536 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:33:50.869185 ip-10-0-143-30 
kubenswrapper[2577]: W0422 15:33:50.864540 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864543 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864546 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864549 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864552 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864555 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864557 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864560 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864563 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864565 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864568 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864570 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864573 2577 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864575 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864577 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864580 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864583 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864586 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864588 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:33:50.869185 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864590 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864593 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864595 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864598 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864600 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864602 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864605 2577 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864607 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864610 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864612 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864615 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864617 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864619 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864622 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864624 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864627 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864630 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864634 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864637 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:33:50.869672 ip-10-0-143-30 
kubenswrapper[2577]: W0422 15:33:50.864641 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:33:50.869672 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864644 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864646 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864649 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864651 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864655 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864657 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864660 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864662 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864664 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864667 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864672 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864675 2577 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864678 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:50.864680 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.864685 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:33:50.870178 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.865392 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 15:33:50.870551 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.867623 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 15:33:50.870551 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.868437 2577 server.go:1019] "Starting client certificate rotation" Apr 22 15:33:50.870551 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.868538 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 15:33:50.870551 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.868572 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 15:33:50.894278 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.894258 2577 dynamic_cafile_content.go:123] "Loaded 
a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 15:33:50.899202 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.899186 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 15:33:50.913060 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.913037 2577 log.go:25] "Validated CRI v1 runtime API" Apr 22 15:33:50.918440 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.918425 2577 log.go:25] "Validated CRI v1 image API" Apr 22 15:33:50.919835 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.919800 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 15:33:50.922560 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.922546 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 15:33:50.923929 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.923886 2577 fs.go:135] Filesystem UUIDs: map[1508cece-c1a2-4e71-8bdd-3d3c519c2d19:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 cd6ce5f3-7948-42fa-be93-9f2691ac8c11:/dev/nvme0n1p4] Apr 22 15:33:50.923980 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.923930 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 15:33:50.929640 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.929528 2577 manager.go:217] Machine: {Timestamp:2026-04-22 15:33:50.927503067 +0000 UTC m=+0.397053857 CPUVendorID:GenuineIntel 
NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3110960 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec219c0decbb9fa0915a6d50adbd2725 SystemUUID:ec219c0d-ecbb-9fa0-915a-6d50adbd2725 BootID:646023eb-0346-4e82-a1c4-d507ede137af Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:dc:d8:83:31:27 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:dc:d8:83:31:27 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:23:41:f3:9b:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:33:50.929640 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.929635 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:33:50.929734 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.929705 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 15:33:50.930766 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.930740 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 15:33:50.930918 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.930769 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-30.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 15:33:50.930969 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.930926 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 15:33:50.930969 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.930935 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 15:33:50.930969 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.930951 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:33:50.931706 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.931697 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:33:50.932502 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.932492 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 15:33:50.932727 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.932718 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 15:33:50.935323 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.935312 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 15:33:50.935357 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.935327 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 15:33:50.935357 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.935339 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 15:33:50.935357 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.935351 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 22 15:33:50.935433 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.935359 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 15:33:50.936425 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.936410 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:33:50.936488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.936435 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:33:50.940511 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.940493 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 15:33:50.941701 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.941686 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 15:33:50.942171 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.942154 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v6bpl"
Apr 22 15:33:50.943451 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943437 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 15:33:50.943451 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943454 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943460 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943465 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943472 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943477 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943483 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943488 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943495 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943500 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943508 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 15:33:50.943546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.943517 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 15:33:50.944474 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.944464 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 15:33:50.944474 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.944476 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 15:33:50.947880 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.947866 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 15:33:50.947967 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.947920 2577 server.go:1295] "Started kubelet"
Apr 22 15:33:50.948063 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.948019 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 15:33:50.948174 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.948079 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 15:33:50.948563 ip-10-0-143-30 systemd[1]: Started Kubernetes Kubelet.
Apr 22 15:33:50.950361 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.950327 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 15:33:50.951336 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.951318 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 15:33:50.952030 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.952007 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v6bpl"
Apr 22 15:33:50.952555 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.952527 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-30.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 15:33:50.952555 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.952554 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 15:33:50.952690 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:50.952640 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-30.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 15:33:50.952784 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:50.952763 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 15:33:50.958046 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958031 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 15:33:50.958270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958244 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 15:33:50.958635 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958618 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 15:33:50.958921 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958649 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 15:33:50.958921 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958718 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 15:33:50.958921 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958752 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 15:33:50.958921 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.958761 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 15:33:50.958921 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:50.958834 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:50.960743 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.960727 2577 factory.go:153] Registering CRI-O factory
Apr 22 15:33:50.960855 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.960846 2577 factory.go:223] Registration of the crio container factory successfully
Apr 22 15:33:50.960991 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.960982 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 15:33:50.961067 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.961058 2577 factory.go:55] Registering systemd factory
Apr 22 15:33:50.961128 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.961121 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 22 15:33:50.961198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.961192 2577 factory.go:103] Registering Raw factory
Apr 22 15:33:50.961267 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.961261 2577 manager.go:1196] Started watching for new ooms in manager
Apr 22 15:33:50.961662 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.961640 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:50.961747 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.961738 2577 manager.go:319] Starting recovery of all containers
Apr 22 15:33:50.961881 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:50.961861 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 15:33:50.966004 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:50.965982 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-30.ec2.internal\" not found" node="ip-10-0-143-30.ec2.internal"
Apr 22 15:33:50.970039 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.969794 2577 manager.go:324] Recovery completed
Apr 22 15:33:50.971277 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:50.971247 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 15:33:50.974000 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.973985 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:33:50.976641 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.976627 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:33:50.976724 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.976656 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:33:50.976724 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.976684 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:33:50.977122 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.977108 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 15:33:50.977122 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.977121 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 15:33:50.977242 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.977139 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 15:33:50.979371 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.979358 2577 policy_none.go:49] "None policy: Start"
Apr 22 15:33:50.979434 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.979376 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 15:33:50.979434 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:50.979388 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 15:33:51.018454 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018440 2577 manager.go:341] "Starting Device Plugin manager"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.018470 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018482 2577 server.go:85] "Starting device plugin registration server"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018675 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018684 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018777 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018855 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.018865 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.019290 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 15:33:51.026004 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.019318 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.078826 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.078798 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 15:33:51.080151 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.080134 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 15:33:51.080225 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.080164 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 15:33:51.080225 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.080182 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 15:33:51.080225 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.080188 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 15:33:51.080225 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.080217 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 15:33:51.083281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.083262 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:51.119045 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.119006 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:33:51.119926 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.119892 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:33:51.119995 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.119939 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:33:51.119995 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.119950 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:33:51.119995 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.119969 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.128477 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.128465 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.128532 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.128483 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-30.ec2.internal\": node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.142595 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.142577 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.180418 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.180398 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"]
Apr 22 15:33:51.180486 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.180471 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:33:51.181746 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.181733 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:33:51.181819 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.181755 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:33:51.181819 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.181768 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:33:51.183036 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183024 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:33:51.183167 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183154 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.183232 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183182 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:33:51.183626 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183612 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:33:51.183677 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183631 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:33:51.183677 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183641 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:33:51.183677 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183657 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:33:51.183790 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183680 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:33:51.183790 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.183690 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:33:51.184934 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.184919 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.184988 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.184950 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 15:33:51.185610 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.185595 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 15:33:51.185681 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.185624 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 15:33:51.185681 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.185639 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeHasSufficientPID"
Apr 22 15:33:51.208412 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.208395 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-30.ec2.internal\" not found" node="ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.212511 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.212498 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-30.ec2.internal\" not found" node="ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.243634 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.243612 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.260455 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.260436 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c7c7590a669d45bd157718172f22e2c3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal\" (UID: \"c7c7590a669d45bd157718172f22e2c3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.260548 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.260469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7c7590a669d45bd157718172f22e2c3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal\" (UID: \"c7c7590a669d45bd157718172f22e2c3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.260548 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.260495 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7dcbdaa34312fcd881c943ed9a362c63-config\") pod \"kube-apiserver-proxy-ip-10-0-143-30.ec2.internal\" (UID: \"7dcbdaa34312fcd881c943ed9a362c63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.344118 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.344099 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.361472 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.361450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c7c7590a669d45bd157718172f22e2c3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal\" (UID: \"c7c7590a669d45bd157718172f22e2c3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.361545 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.361477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c7c7590a669d45bd157718172f22e2c3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal\" (UID: \"c7c7590a669d45bd157718172f22e2c3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.361545 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.361507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7c7590a669d45bd157718172f22e2c3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal\" (UID: \"c7c7590a669d45bd157718172f22e2c3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.361545 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.361534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7dcbdaa34312fcd881c943ed9a362c63-config\") pod \"kube-apiserver-proxy-ip-10-0-143-30.ec2.internal\" (UID: \"7dcbdaa34312fcd881c943ed9a362c63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.361649 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.361564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7dcbdaa34312fcd881c943ed9a362c63-config\") pod \"kube-apiserver-proxy-ip-10-0-143-30.ec2.internal\" (UID: \"7dcbdaa34312fcd881c943ed9a362c63\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.361649 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.361592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7c7590a669d45bd157718172f22e2c3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal\" (UID: \"c7c7590a669d45bd157718172f22e2c3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.444836 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.444795 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.510228 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.510206 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.515061 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.515047 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:51.545531 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.545504 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.646065 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.646042 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.746559 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.746516 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.847065 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.847048 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.868508 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.868490 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 15:33:51.869027 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.868612 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:33:51.869027 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.868671 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:33:51.948111 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:51.948075 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:51.957358 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.957313 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:28:50 +0000 UTC" deadline="2027-10-26 13:42:09.649420665 +0000 UTC"
Apr 22 15:33:51.957513 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.957360 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13246h8m17.692065239s"
Apr 22 15:33:51.958465 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.958451 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 15:33:51.972198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.972179 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:33:51.995591 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:51.995569 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-kbqnv"
Apr 22 15:33:52.003300 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.003262 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-kbqnv"
Apr 22 15:33:52.038937 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:52.038913 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c7590a669d45bd157718172f22e2c3.slice/crio-766151e6fc760c2ec14283c8d9c153fea45bfebf8e3804988abfa7a4ddea5a22 WatchSource:0}: Error finding container 766151e6fc760c2ec14283c8d9c153fea45bfebf8e3804988abfa7a4ddea5a22: Status 404 returned error can't find the container with id 766151e6fc760c2ec14283c8d9c153fea45bfebf8e3804988abfa7a4ddea5a22
Apr 22 15:33:52.039140 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:52.039125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcbdaa34312fcd881c943ed9a362c63.slice/crio-ba124f82abc3357c2cd211da1efc8eaf3eaa36801371223abfa6c8b3795b5503 WatchSource:0}: Error finding container ba124f82abc3357c2cd211da1efc8eaf3eaa36801371223abfa6c8b3795b5503: Status 404 returned error can't find the container with id ba124f82abc3357c2cd211da1efc8eaf3eaa36801371223abfa6c8b3795b5503
Apr 22 15:33:52.044485 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.044472 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:33:52.049241 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:52.049222 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:52.082409 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.082370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal" event={"ID":"7dcbdaa34312fcd881c943ed9a362c63","Type":"ContainerStarted","Data":"ba124f82abc3357c2cd211da1efc8eaf3eaa36801371223abfa6c8b3795b5503"}
Apr 22 15:33:52.083196 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.083171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal" event={"ID":"c7c7590a669d45bd157718172f22e2c3","Type":"ContainerStarted","Data":"766151e6fc760c2ec14283c8d9c153fea45bfebf8e3804988abfa7a4ddea5a22"}
Apr 22 15:33:52.129634 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.129615 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:52.149620 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:52.149601 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-30.ec2.internal\" not found"
Apr 22 15:33:52.230307 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.230286 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:33:52.258864 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.258823 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:52.270929 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.270910 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:33:52.271745 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.271731 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal"
Apr 22 15:33:52.280019 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.279999 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can 
result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 15:33:52.802557 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.802520 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:33:52.936120 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.936077 2577 apiserver.go:52] "Watching apiserver" Apr 22 15:33:52.942784 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.942759 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 15:33:52.943120 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.943097 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vmhqd","openshift-image-registry/node-ca-wkqg5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal","openshift-multus/multus-additional-cni-plugins-x6qrc","openshift-network-operator/iptables-alerter-wd2xg","kube-system/konnectivity-agent-ljlbr","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz","openshift-dns/node-resolver-vqw4p","openshift-multus/multus-gxvvw","openshift-multus/network-metrics-daemon-zzcmv","openshift-network-diagnostics/network-check-target-zhpd9","openshift-ovn-kubernetes/ovnkube-node-4qpdw"] Apr 22 15:33:52.944441 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.944422 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.945695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.945675 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.946944 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.946925 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:52.948165 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.948146 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.949216 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.949197 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:52.950294 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.950277 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:52.951272 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.951237 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.951367 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.951282 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 15:33:52.951367 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.951354 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 15:33:52.951513 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.951499 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.951914 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.951644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.951914 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.951822 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m6qv5\"" Apr 22 15:33:52.952730 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.952710 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:52.952813 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.952775 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.953825 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.953647 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.953825 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.953703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gmt97\"" Apr 22 15:33:52.954092 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.954077 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.955295 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.955275 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:52.955384 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.955346 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:33:52.955384 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:52.955357 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:33:52.955489 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:52.955410 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:33:52.957329 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.957310 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.957420 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.957348 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bwr7v\"" Apr 22 15:33:52.957420 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.957376 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.957526 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.957418 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 15:33:52.961030 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.961011 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 15:33:52.961334 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.961318 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 15:33:52.961530 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.961516 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.961617 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.961581 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 15:33:52.961617 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.961610 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zbbc9\"" Apr 22 15:33:52.961885 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.961809 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.962445 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962423 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.962445 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962441 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 15:33:52.962585 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962446 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zdcdt\"" Apr 22 15:33:52.962585 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 15:33:52.962685 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962627 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 15:33:52.962685 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962633 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 15:33:52.962685 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962645 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-222gn\"" Apr 22 15:33:52.962971 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.962950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wthx2\"" Apr 22 15:33:52.967506 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.967089 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.967506 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.967167 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ldwqs\"" Apr 22 15:33:52.967506 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.967196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.967506 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.967217 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.969879 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.969861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysconfig\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.969981 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.969888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/168184b4-2b79-4b84-9cff-2a4fe584c2ab-tmp\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.969981 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.969931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0acda737-f5c9-4897-bd5e-94296fc02284-serviceca\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:52.969981 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.969957 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-cni-netd\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970119 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.969984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2rt5\" (UniqueName: \"kubernetes.io/projected/bf7bc51e-13ec-42f9-912c-b51cd7134006-kube-api-access-j2rt5\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970119 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-cni-bin\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.970119 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.970119 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-device-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: 
\"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/45497d38-5523-4037-9d5e-b2d5cf55efc2-tmp-dir\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970139 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-etc-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-node-log\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovnkube-config\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-hostroot\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-log-socket\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01fd759b-aafb-49c6-a60e-5424150b1157-host-slash\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:52.970303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0acda737-f5c9-4897-bd5e-94296fc02284-host\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-ovn\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-cni-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-os-release\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9c1f5b49-88a2-49ca-a478-10b546545331-konnectivity-ca\") pod \"konnectivity-agent-ljlbr\" (UID: \"9c1f5b49-88a2-49ca-a478-10b546545331\") " pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:52.970675 ip-10-0-143-30 
kubenswrapper[2577]: I0422 15:33:52.970444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-systemd-units\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-systemd\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2pp\" (UniqueName: \"kubernetes.io/projected/8644527b-1d6e-4618-95f6-57427939a8e7-kube-api-access-kh2pp\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-lib-modules\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/45497d38-5523-4037-9d5e-b2d5cf55efc2-hosts-file\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " 
pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970613 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-cni-bin\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.970675 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-env-overrides\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-cnibin\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-kubernetes\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970749 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-systemd\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cni-binary-copy\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9c1f5b49-88a2-49ca-a478-10b546545331-agent-certs\") pod \"konnectivity-agent-ljlbr\" (UID: \"9c1f5b49-88a2-49ca-a478-10b546545331\") " pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-var-lib-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970858 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovn-node-metrics-cert\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8644527b-1d6e-4618-95f6-57427939a8e7-cni-binary-copy\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-modprobe-d\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-var-lib-kubelet\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.970977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: 
\"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlgl9\" (UniqueName: \"kubernetes.io/projected/01fd759b-aafb-49c6-a60e-5424150b1157-kube-api-access-hlgl9\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-k8s-cni-cncf-io\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971062 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-conf-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-multus-certs\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysctl-conf\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971135 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-system-cni-dir\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971178 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-socket-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 
15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-socket-dir-parent\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-run\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-sys\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqqt5\" (UniqueName: \"kubernetes.io/projected/f7958036-9067-47bb-91dc-dc565feb289a-kube-api-access-dqqt5\") pod \"network-metrics-daemon-zzcmv\" 
(UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8644527b-1d6e-4618-95f6-57427939a8e7-multus-daemon-config\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971326 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-etc-kubernetes\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysctl-d\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-tuned\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971438 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-system-cni-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-netns\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.971970 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-kubelet\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-host\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971528 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bppb2\" (UniqueName: \"kubernetes.io/projected/168184b4-2b79-4b84-9cff-2a4fe584c2ab-kube-api-access-bppb2\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mx4qm\" (UniqueName: \"kubernetes.io/projected/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-kube-api-access-mx4qm\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7f2w\" (UniqueName: \"kubernetes.io/projected/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-kube-api-access-p7f2w\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-kubelet\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-slash\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-registration-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-etc-selinux\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs9r\" (UniqueName: \"kubernetes.io/projected/45497d38-5523-4037-9d5e-b2d5cf55efc2-kube-api-access-cxs9r\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2sr\" (UniqueName: \"kubernetes.io/projected/0acda737-f5c9-4897-bd5e-94296fc02284-kube-api-access-cp2sr\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-cni-multus\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971892 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cnibin\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01fd759b-aafb-49c6-a60e-5424150b1157-iptables-alerter-script\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.971988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-sys-fs\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:52.972695 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-run-netns\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 
15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovnkube-script-lib\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-os-release\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972183 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972215 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972288 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972303 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972480 2577 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 15:33:52.973388 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.972510 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-db5kj\"" Apr 22 15:33:52.977783 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:52.977766 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:33:53.003890 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.003868 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:28:51 +0000 UTC" deadline="2027-10-19 04:20:23.645782534 +0000 UTC" Apr 22 15:33:53.003890 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.003889 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13068h46m30.641895988s" Apr 22 15:33:53.072374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.072303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9c1f5b49-88a2-49ca-a478-10b546545331-agent-certs\") pod \"konnectivity-agent-ljlbr\" (UID: \"9c1f5b49-88a2-49ca-a478-10b546545331\") " pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:53.072374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.072337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-var-lib-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.072374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.072368 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovn-node-metrics-cert\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.072552 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.072455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-var-lib-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.072834 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.072803 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 15:33:53.073106 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8644527b-1d6e-4618-95f6-57427939a8e7-cni-binary-copy\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073221 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8644527b-1d6e-4618-95f6-57427939a8e7-cni-binary-copy\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073221 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-modprobe-d\") pod 
\"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073330 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-var-lib-kubelet\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073330 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.073330 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlgl9\" (UniqueName: \"kubernetes.io/projected/01fd759b-aafb-49c6-a60e-5424150b1157-kube-api-access-hlgl9\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.073330 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-k8s-cni-cncf-io\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-conf-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-modprobe-d\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073370 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-multus-certs\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysctl-conf\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-system-cni-dir\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-var-lib-kubelet\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073418 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073433 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-socket-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-socket-dir-parent\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073473 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-run\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-sys\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073544 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqqt5\" (UniqueName: \"kubernetes.io/projected/f7958036-9067-47bb-91dc-dc565feb289a-kube-api-access-dqqt5\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8644527b-1d6e-4618-95f6-57427939a8e7-multus-daemon-config\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073605 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-etc-kubernetes\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysctl-d\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-tuned\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-system-cni-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-netns\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073797 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-kubelet\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-host\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bppb2\" (UniqueName: \"kubernetes.io/projected/168184b4-2b79-4b84-9cff-2a4fe584c2ab-kube-api-access-bppb2\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4qm\" (UniqueName: \"kubernetes.io/projected/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-kube-api-access-mx4qm\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7f2w\" (UniqueName: \"kubernetes.io/projected/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-kube-api-access-p7f2w\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:33:53.073872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-kubelet\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-slash\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.073931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-registration-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-etc-selinux\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.073977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs9r\" (UniqueName: \"kubernetes.io/projected/45497d38-5523-4037-9d5e-b2d5cf55efc2-kube-api-access-cxs9r\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 
15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2sr\" (UniqueName: \"kubernetes.io/projected/0acda737-f5c9-4897-bd5e-94296fc02284-kube-api-access-cp2sr\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-cni-multus\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-etc-kubernetes\") pod \"multus-gxvvw\" (UID: 
\"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074212 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cnibin\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01fd759b-aafb-49c6-a60e-5424150b1157-iptables-alerter-script\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-sys-fs\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysctl-d\") pod 
\"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-run-netns\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovnkube-script-lib\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-run-netns\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-os-release\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.074515 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074411 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-k8s-cni-cncf-io\") pod 
\"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysconfig\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/168184b4-2b79-4b84-9cff-2a4fe584c2ab-tmp\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0acda737-f5c9-4897-bd5e-94296fc02284-serviceca\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-cni-netd\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2rt5\" (UniqueName: \"kubernetes.io/projected/bf7bc51e-13ec-42f9-912c-b51cd7134006-kube-api-access-j2rt5\") pod \"ovnkube-node-4qpdw\" (UID: 
\"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-cni-bin\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-device-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/45497d38-5523-4037-9d5e-b2d5cf55efc2-tmp-dir\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-etc-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-node-log\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovnkube-config\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074749 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-hostroot\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-log-socket\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/01fd759b-aafb-49c6-a60e-5424150b1157-host-slash\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0acda737-f5c9-4897-bd5e-94296fc02284-host\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:53.074992 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-ovn\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-cni-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-os-release\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9c1f5b49-88a2-49ca-a478-10b546545331-konnectivity-ca\") pod \"konnectivity-agent-ljlbr\" (UID: \"9c1f5b49-88a2-49ca-a478-10b546545331\") " pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-systemd-units\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-systemd\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2pp\" 
(UniqueName: \"kubernetes.io/projected/8644527b-1d6e-4618-95f6-57427939a8e7-kube-api-access-kh2pp\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-lib-modules\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075237 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-sys\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-cni-netd\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/45497d38-5523-4037-9d5e-b2d5cf55efc2-hosts-file\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-system-cni-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075314 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/45497d38-5523-4037-9d5e-b2d5cf55efc2-hosts-file\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-netns\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-cni-bin\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.075666 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075386 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-kubelet\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-env-overrides\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-cni-bin\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-host\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-cnibin\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-conf-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-kubernetes\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-run-multus-certs\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-systemd\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cni-binary-copy\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-systemd\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-node-log\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.075783 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-etc-openvswitch\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-os-release\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/0acda737-f5c9-4897-bd5e-94296fc02284-host\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-hostroot\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.076164 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.076286 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:33:53.576249335 +0000 UTC m=+3.045800128 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:53.076488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovn-node-metrics-cert\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-ovn\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-run-ovn-kubernetes\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysctl-conf\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076696 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-cni-dir\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-system-cni-dir\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-log-socket\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01fd759b-aafb-49c6-a60e-5424150b1157-host-slash\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9c1f5b49-88a2-49ca-a478-10b546545331-agent-certs\") pod \"konnectivity-agent-ljlbr\" (UID: \"9c1f5b49-88a2-49ca-a478-10b546545331\") " pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076839 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-tuned\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.076892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.074450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-os-release\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.077313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.077202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-cni-multus\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.077841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.077302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovnkube-config\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.077841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.077383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-kubelet\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.077841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.077430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-sys-fs\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.077841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.077439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.077841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.077453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-slash\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.078121 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-cnibin\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.078176 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078129 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-kubernetes\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.078265 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-device-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.078362 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-socket-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.078413 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078390 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-multus-socket-dir-parent\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.078472 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078434 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-registration-dir\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.078619 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078568 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-run\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.078724 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cnibin\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.078805 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.078778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-etc-selinux\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: \"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.079182 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.079165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-run-systemd\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.079288 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.079257 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-systemd-units\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.079384 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.079368 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-cni-binary-copy\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.079469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.079446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0acda737-f5c9-4897-bd5e-94296fc02284-serviceca\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.079541 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.079468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-env-overrides\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.080152 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.080133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-etc-sysconfig\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.080250 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.080187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9c1f5b49-88a2-49ca-a478-10b546545331-konnectivity-ca\") pod \"konnectivity-agent-ljlbr\" (UID: \"9c1f5b49-88a2-49ca-a478-10b546545331\") " pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:53.080470 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.080453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/45497d38-5523-4037-9d5e-b2d5cf55efc2-tmp-dir\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:53.080880 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.080841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8644527b-1d6e-4618-95f6-57427939a8e7-multus-daemon-config\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.081033 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.080965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf7bc51e-13ec-42f9-912c-b51cd7134006-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.081115 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.081050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8644527b-1d6e-4618-95f6-57427939a8e7-host-var-lib-cni-bin\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.082494 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.082475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/168184b4-2b79-4b84-9cff-2a4fe584c2ab-lib-modules\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.084089 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.084064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/01fd759b-aafb-49c6-a60e-5424150b1157-iptables-alerter-script\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.085195 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.085177 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:53.085375 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.085199 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:53.085375 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.085220 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxlv6 for pod openshift-network-diagnostics/network-check-target-zhpd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:53.085375 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.085283 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bf7bc51e-13ec-42f9-912c-b51cd7134006-ovnkube-script-lib\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.085777 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.085514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6 podName:f537b0eb-8087-4572-b237-83ff59e51f13 nodeName:}" failed. No retries permitted until 2026-04-22 15:33:53.58547189 +0000 UTC m=+3.055022667 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pxlv6" (UniqueName: "kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6") pod "network-check-target-zhpd9" (UID: "f537b0eb-8087-4572-b237-83ff59e51f13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:53.086763 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.086220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/168184b4-2b79-4b84-9cff-2a4fe584c2ab-tmp\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.086763 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.086663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqqt5\" (UniqueName: \"kubernetes.io/projected/f7958036-9067-47bb-91dc-dc565feb289a-kube-api-access-dqqt5\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:53.087600 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.087503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4qm\" (UniqueName: \"kubernetes.io/projected/2b6e7ecf-d54b-4238-8dd4-a8502eb2627e-kube-api-access-mx4qm\") pod \"multus-additional-cni-plugins-x6qrc\" (UID: \"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e\") " pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.089673 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.089647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2sr\" (UniqueName: \"kubernetes.io/projected/0acda737-f5c9-4897-bd5e-94296fc02284-kube-api-access-cp2sr\") pod \"node-ca-wkqg5\" (UID: \"0acda737-f5c9-4897-bd5e-94296fc02284\") " 
pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.089802 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.089780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2rt5\" (UniqueName: \"kubernetes.io/projected/bf7bc51e-13ec-42f9-912c-b51cd7134006-kube-api-access-j2rt5\") pod \"ovnkube-node-4qpdw\" (UID: \"bf7bc51e-13ec-42f9-912c-b51cd7134006\") " pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.089949 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.089884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bppb2\" (UniqueName: \"kubernetes.io/projected/168184b4-2b79-4b84-9cff-2a4fe584c2ab-kube-api-access-bppb2\") pod \"tuned-vmhqd\" (UID: \"168184b4-2b79-4b84-9cff-2a4fe584c2ab\") " pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.090116 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.090094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs9r\" (UniqueName: \"kubernetes.io/projected/45497d38-5523-4037-9d5e-b2d5cf55efc2-kube-api-access-cxs9r\") pod \"node-resolver-vqw4p\" (UID: \"45497d38-5523-4037-9d5e-b2d5cf55efc2\") " pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:53.090184 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.090105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2pp\" (UniqueName: \"kubernetes.io/projected/8644527b-1d6e-4618-95f6-57427939a8e7-kube-api-access-kh2pp\") pod \"multus-gxvvw\" (UID: \"8644527b-1d6e-4618-95f6-57427939a8e7\") " pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.090677 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.090637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7f2w\" (UniqueName: \"kubernetes.io/projected/f7ae90cc-3bb3-4d83-8b18-0478053cfa90-kube-api-access-p7f2w\") pod \"aws-ebs-csi-driver-node-x7pkz\" (UID: 
\"f7ae90cc-3bb3-4d83-8b18-0478053cfa90\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.091721 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.091680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlgl9\" (UniqueName: \"kubernetes.io/projected/01fd759b-aafb-49c6-a60e-5424150b1157-kube-api-access-hlgl9\") pod \"iptables-alerter-wd2xg\" (UID: \"01fd759b-aafb-49c6-a60e-5424150b1157\") " pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.257517 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.257487 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gxvvw" Apr 22 15:33:53.265320 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.265297 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" Apr 22 15:33:53.271745 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.271726 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wkqg5" Apr 22 15:33:53.278401 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.278382 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" Apr 22 15:33:53.284923 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.284887 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wd2xg" Apr 22 15:33:53.291505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.291488 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:33:53.298087 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.298071 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" Apr 22 15:33:53.304599 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.304581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vqw4p" Apr 22 15:33:53.309082 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.309066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:33:53.578948 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.578919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:33:53.579111 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.579054 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:53.579111 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.579106 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:33:54.579092252 +0000 UTC m=+4.048643029 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:33:53.679454 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:53.679431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:33:53.679545 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.679533 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:33:53.679596 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.679548 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:33:53.679596 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.679556 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxlv6 for pod openshift-network-diagnostics/network-check-target-zhpd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:53.679655 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:53.679597 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6 podName:f537b0eb-8087-4572-b237-83ff59e51f13 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:33:54.679585792 +0000 UTC m=+4.149136569 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxlv6" (UniqueName: "kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6") pod "network-check-target-zhpd9" (UID: "f537b0eb-8087-4572-b237-83ff59e51f13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:33:53.695283 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.695261 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1f5b49_88a2_49ca_a478_10b546545331.slice/crio-983a5664215703187478ed4cc714e9ecacf0a571c75a5ffd75af64359b484a3f WatchSource:0}: Error finding container 983a5664215703187478ed4cc714e9ecacf0a571c75a5ffd75af64359b484a3f: Status 404 returned error can't find the container with id 983a5664215703187478ed4cc714e9ecacf0a571c75a5ffd75af64359b484a3f Apr 22 15:33:53.697169 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.697138 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ae90cc_3bb3_4d83_8b18_0478053cfa90.slice/crio-5090209283eadd3d43343e18e5f3696b0f14d74d0f8484e68c355e925f2a11ff WatchSource:0}: Error finding container 5090209283eadd3d43343e18e5f3696b0f14d74d0f8484e68c355e925f2a11ff: Status 404 returned error can't find the container with id 5090209283eadd3d43343e18e5f3696b0f14d74d0f8484e68c355e925f2a11ff Apr 22 15:33:53.700105 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.700082 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6e7ecf_d54b_4238_8dd4_a8502eb2627e.slice/crio-7e7ea6e07c2ce3f305f9d65c10707256084f75e16b181439c53e0f6364ab4b23 WatchSource:0}: Error finding container 
7e7ea6e07c2ce3f305f9d65c10707256084f75e16b181439c53e0f6364ab4b23: Status 404 returned error can't find the container with id 7e7ea6e07c2ce3f305f9d65c10707256084f75e16b181439c53e0f6364ab4b23 Apr 22 15:33:53.700416 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.700390 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01fd759b_aafb_49c6_a60e_5424150b1157.slice/crio-88930e68b6354b57db046dd711b9366a6971311419931b63c4f7af844e3dc61e WatchSource:0}: Error finding container 88930e68b6354b57db046dd711b9366a6971311419931b63c4f7af844e3dc61e: Status 404 returned error can't find the container with id 88930e68b6354b57db046dd711b9366a6971311419931b63c4f7af844e3dc61e Apr 22 15:33:53.702188 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.702166 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168184b4_2b79_4b84_9cff_2a4fe584c2ab.slice/crio-e007e34a57afefc7135c99446e9e3636f15b18108ac1ed19346a5f2953741dcb WatchSource:0}: Error finding container e007e34a57afefc7135c99446e9e3636f15b18108ac1ed19346a5f2953741dcb: Status 404 returned error can't find the container with id e007e34a57afefc7135c99446e9e3636f15b18108ac1ed19346a5f2953741dcb Apr 22 15:33:53.703616 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.703546 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45497d38_5523_4037_9d5e_b2d5cf55efc2.slice/crio-0eb6f7a64e65e08acc231cec45bccc9a40f1d80b337ec485582dee07001bea78 WatchSource:0}: Error finding container 0eb6f7a64e65e08acc231cec45bccc9a40f1d80b337ec485582dee07001bea78: Status 404 returned error can't find the container with id 0eb6f7a64e65e08acc231cec45bccc9a40f1d80b337ec485582dee07001bea78 Apr 22 15:33:53.704334 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.704308 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf7bc51e_13ec_42f9_912c_b51cd7134006.slice/crio-98dc6f161e79d34ed37db10a0a104bed2d81dca7d01fbbf0fed9a9057faf8bac WatchSource:0}: Error finding container 98dc6f161e79d34ed37db10a0a104bed2d81dca7d01fbbf0fed9a9057faf8bac: Status 404 returned error can't find the container with id 98dc6f161e79d34ed37db10a0a104bed2d81dca7d01fbbf0fed9a9057faf8bac Apr 22 15:33:53.705471 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.705011 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8644527b_1d6e_4618_95f6_57427939a8e7.slice/crio-b60d0c32742d1e8f28a128acaebefbe3556092a659289b0d11a382680da3faa1 WatchSource:0}: Error finding container b60d0c32742d1e8f28a128acaebefbe3556092a659289b0d11a382680da3faa1: Status 404 returned error can't find the container with id b60d0c32742d1e8f28a128acaebefbe3556092a659289b0d11a382680da3faa1 Apr 22 15:33:53.706252 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:33:53.706067 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0acda737_f5c9_4897_bd5e_94296fc02284.slice/crio-a6231773094061450763f9439d012cb34cd6db20e3c6c1ba6f1d5336f26881e0 WatchSource:0}: Error finding container a6231773094061450763f9439d012cb34cd6db20e3c6c1ba6f1d5336f26881e0: Status 404 returned error can't find the container with id a6231773094061450763f9439d012cb34cd6db20e3c6c1ba6f1d5336f26881e0 Apr 22 15:33:54.004703 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.004392 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:28:51 +0000 UTC" deadline="2027-12-20 21:55:07.895089353 +0000 UTC" Apr 22 15:33:54.004703 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.004633 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14574h21m13.890461763s"
Apr 22 15:33:54.088324 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.088265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wd2xg" event={"ID":"01fd759b-aafb-49c6-a60e-5424150b1157","Type":"ContainerStarted","Data":"88930e68b6354b57db046dd711b9366a6971311419931b63c4f7af844e3dc61e"}
Apr 22 15:33:54.092154 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.092100 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerStarted","Data":"7e7ea6e07c2ce3f305f9d65c10707256084f75e16b181439c53e0f6364ab4b23"}
Apr 22 15:33:54.094094 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.094054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"98dc6f161e79d34ed37db10a0a104bed2d81dca7d01fbbf0fed9a9057faf8bac"}
Apr 22 15:33:54.097199 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.097178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vqw4p" event={"ID":"45497d38-5523-4037-9d5e-b2d5cf55efc2","Type":"ContainerStarted","Data":"0eb6f7a64e65e08acc231cec45bccc9a40f1d80b337ec485582dee07001bea78"}
Apr 22 15:33:54.108054 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.108028 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" event={"ID":"168184b4-2b79-4b84-9cff-2a4fe584c2ab","Type":"ContainerStarted","Data":"e007e34a57afefc7135c99446e9e3636f15b18108ac1ed19346a5f2953741dcb"}
Apr 22 15:33:54.109674 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.109648 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" event={"ID":"f7ae90cc-3bb3-4d83-8b18-0478053cfa90","Type":"ContainerStarted","Data":"5090209283eadd3d43343e18e5f3696b0f14d74d0f8484e68c355e925f2a11ff"}
Apr 22 15:33:54.111997 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.111927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ljlbr" event={"ID":"9c1f5b49-88a2-49ca-a478-10b546545331","Type":"ContainerStarted","Data":"983a5664215703187478ed4cc714e9ecacf0a571c75a5ffd75af64359b484a3f"}
Apr 22 15:33:54.117848 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.117791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal" event={"ID":"7dcbdaa34312fcd881c943ed9a362c63","Type":"ContainerStarted","Data":"0a408b1423c5fb47910ad8072bee17333d35a423ef28ccefa14913ccf64f5163"}
Apr 22 15:33:54.125924 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.125866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wkqg5" event={"ID":"0acda737-f5c9-4897-bd5e-94296fc02284","Type":"ContainerStarted","Data":"a6231773094061450763f9439d012cb34cd6db20e3c6c1ba6f1d5336f26881e0"}
Apr 22 15:33:54.131188 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.131141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gxvvw" event={"ID":"8644527b-1d6e-4618-95f6-57427939a8e7","Type":"ContainerStarted","Data":"b60d0c32742d1e8f28a128acaebefbe3556092a659289b0d11a382680da3faa1"}
Apr 22 15:33:54.594918 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.588600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:33:54.594918 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:54.588777 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:33:54.594918 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:54.588857 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:33:56.588836029 +0000 UTC m=+6.058386818 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:33:54.690213 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:54.689601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:33:54.690213 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:54.689761 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:33:54.690213 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:54.689782 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:33:54.690213 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:54.689793 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxlv6 for pod openshift-network-diagnostics/network-check-target-zhpd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:33:54.690213 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:54.689850 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6 podName:f537b0eb-8087-4572-b237-83ff59e51f13 nodeName:}" failed. No retries permitted until 2026-04-22 15:33:56.689831992 +0000 UTC m=+6.159382789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxlv6" (UniqueName: "kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6") pod "network-check-target-zhpd9" (UID: "f537b0eb-8087-4572-b237-83ff59e51f13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:33:55.081762 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:55.081002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:33:55.081762 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:55.081201 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:33:55.081762 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:55.081636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:33:55.081762 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:55.081724 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:33:55.145484 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:55.145439 2577 generic.go:358] "Generic (PLEG): container finished" podID="c7c7590a669d45bd157718172f22e2c3" containerID="9195e7790b91303d51d0a95f174ad5f3db155e46456eb89d2e7c277d93ade234" exitCode=0
Apr 22 15:33:55.145650 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:55.145589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal" event={"ID":"c7c7590a669d45bd157718172f22e2c3","Type":"ContainerDied","Data":"9195e7790b91303d51d0a95f174ad5f3db155e46456eb89d2e7c277d93ade234"}
Apr 22 15:33:55.161377 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:55.161319 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-30.ec2.internal" podStartSLOduration=3.161302105 podStartE2EDuration="3.161302105s" podCreationTimestamp="2026-04-22 15:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:33:54.131064663 +0000 UTC m=+3.600615471" watchObservedRunningTime="2026-04-22 15:33:55.161302105 +0000 UTC m=+4.630852906"
Apr 22 15:33:56.158090 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:56.157937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal" event={"ID":"c7c7590a669d45bd157718172f22e2c3","Type":"ContainerStarted","Data":"a0f11fb910dd6aa9df9789acfb2bf6e1d5d0a64fb2655bcc6bb937476829f73b"}
Apr 22 15:33:56.171474 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:56.170601 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-30.ec2.internal" podStartSLOduration=4.170584198 podStartE2EDuration="4.170584198s" podCreationTimestamp="2026-04-22 15:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:33:56.170160866 +0000 UTC m=+5.639711677" watchObservedRunningTime="2026-04-22 15:33:56.170584198 +0000 UTC m=+5.640134998"
Apr 22 15:33:56.606383 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:56.606294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:33:56.606533 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:56.606462 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:33:56.606533 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:56.606518 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:34:00.60649955 +0000 UTC m=+10.076050334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:33:56.707164 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:56.707131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:33:56.707372 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:56.707310 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:33:56.707372 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:56.707332 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:33:56.707372 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:56.707345 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxlv6 for pod openshift-network-diagnostics/network-check-target-zhpd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:33:56.707515 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:56.707397 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6 podName:f537b0eb-8087-4572-b237-83ff59e51f13 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:00.70737889 +0000 UTC m=+10.176929680 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxlv6" (UniqueName: "kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6") pod "network-check-target-zhpd9" (UID: "f537b0eb-8087-4572-b237-83ff59e51f13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:33:57.084033 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:57.083570 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:33:57.084033 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:57.083706 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:33:57.084255 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:57.084091 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:33:57.084255 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:57.084179 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:33:59.083087 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:59.083050 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:33:59.083540 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:59.083197 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:33:59.083614 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:33:59.083546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:33:59.083663 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:33:59.083632 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:00.641831 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:00.641797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:00.642302 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:00.641982 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:00.642302 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:00.642057 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:34:08.642035406 +0000 UTC m=+18.111586205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:00.742387 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:00.742291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:00.742582 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:00.742446 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:34:00.742582 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:00.742466 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:34:00.742582 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:00.742478 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxlv6 for pod openshift-network-diagnostics/network-check-target-zhpd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:00.742582 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:00.742538 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6 podName:f537b0eb-8087-4572-b237-83ff59e51f13 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:08.742521185 +0000 UTC m=+18.212071966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxlv6" (UniqueName: "kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6") pod "network-check-target-zhpd9" (UID: "f537b0eb-8087-4572-b237-83ff59e51f13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:01.082277 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:01.082194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:01.082277 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:01.082262 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:01.082474 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:01.082280 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:01.082474 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:01.082390 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:34:03.081291 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:03.081214 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:03.081291 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:03.081258 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:03.081722 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:03.081356 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:34:03.081722 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:03.081498 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:05.080709 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:05.080671 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:05.081193 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:05.080679 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:05.081193 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:05.080786 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:05.081193 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:05.080920 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:34:07.080373 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:07.080343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:07.080804 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:07.080424 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:07.080804 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:07.080551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:34:07.081030 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:07.081004 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:08.703040 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:08.703010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:08.703583 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:08.703184 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:08.703583 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:08.703263 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.703241834 +0000 UTC m=+34.172792615 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:08.803634 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:08.803603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:08.803783 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:08.803731 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:34:08.803783 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:08.803745 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:34:08.803783 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:08.803754 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxlv6 for pod openshift-network-diagnostics/network-check-target-zhpd9: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:08.803924 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:08.803801 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6 podName:f537b0eb-8087-4572-b237-83ff59e51f13 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.803787879 +0000 UTC m=+34.273338655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxlv6" (UniqueName: "kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6") pod "network-check-target-zhpd9" (UID: "f537b0eb-8087-4572-b237-83ff59e51f13") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:09.081478 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:09.081393 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:09.081478 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:09.081446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:09.081689 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:09.081538 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:34:09.081960 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:09.081935 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:11.083802 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.083773 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:11.084266 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:11.083927 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a"
Apr 22 15:34:11.084456 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.084416 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:34:11.084576 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:11.084552 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13"
Apr 22 15:34:11.188962 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.186597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" event={"ID":"168184b4-2b79-4b84-9cff-2a4fe584c2ab","Type":"ContainerStarted","Data":"ee9643dd8d7e917288ce2afbcd8f199a1df9a9bc2ede292ce8e5c9795f90eb1e"}
Apr 22 15:34:11.191263 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.191209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" event={"ID":"f7ae90cc-3bb3-4d83-8b18-0478053cfa90","Type":"ContainerStarted","Data":"7cab53c408b5c72cee73d1f959420b92e70b498e0d7442abe83477af5dd17773"}
Apr 22 15:34:11.196191 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.195750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ljlbr" event={"ID":"9c1f5b49-88a2-49ca-a478-10b546545331","Type":"ContainerStarted","Data":"e59a9619ee6a0e3d96393e3ffc93f400094bad0eb27b3c6c2baf2b76d42e3db8"}
Apr 22 15:34:11.198101 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.197498 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gxvvw" event={"ID":"8644527b-1d6e-4618-95f6-57427939a8e7","Type":"ContainerStarted","Data":"79a30b8f87dddd40bfd4c2b5c009a0de867f8325194b953acb7e51fba7edfafb"}
Apr 22 15:34:11.216813 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.216771 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vmhqd" podStartSLOduration=2.9863831259999998 podStartE2EDuration="20.216756416s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.705202899 +0000 UTC m=+3.174753681" lastFinishedPulling="2026-04-22 15:34:10.935576191 +0000 UTC m=+20.405126971" observedRunningTime="2026-04-22 15:34:11.201869258 +0000 UTC m=+20.671420057" watchObservedRunningTime="2026-04-22 15:34:11.216756416 +0000 UTC m=+20.686307218"
Apr 22 15:34:11.232017 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.231981 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ljlbr" podStartSLOduration=7.7745130620000005 podStartE2EDuration="20.231966895s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.697988213 +0000 UTC m=+3.167539004" lastFinishedPulling="2026-04-22 15:34:06.155442054 +0000 UTC m=+15.624992837" observedRunningTime="2026-04-22 15:34:11.217229812 +0000 UTC m=+20.686780612" watchObservedRunningTime="2026-04-22 15:34:11.231966895 +0000 UTC m=+20.701517695"
Apr 22 15:34:11.232198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:11.232173 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gxvvw" podStartSLOduration=2.985722554 podStartE2EDuration="20.232167242s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.706994423 +0000 UTC m=+3.176545207" lastFinishedPulling="2026-04-22 15:34:10.953439118 +0000 UTC m=+20.422989895" observedRunningTime="2026-04-22 15:34:11.231683115 +0000 UTC m=+20.701233915" watchObservedRunningTime="2026-04-22 15:34:11.232167242 +0000 UTC m=+20.701718042"
Apr 22 15:34:12.202966 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.202932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wkqg5" event={"ID":"0acda737-f5c9-4897-bd5e-94296fc02284","Type":"ContainerStarted","Data":"379a99f5503079adaab417d3343b2f8209502be2e832e229b31eab57affd25fe"}
Apr 22 15:34:12.204412 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.204389 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b6e7ecf-d54b-4238-8dd4-a8502eb2627e" containerID="c92412d6df3841956828d2af4b34e7e27b793df8cf46cc697f5f79c408db6a1e" exitCode=0
Apr 22 15:34:12.204497 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.204457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerDied","Data":"c92412d6df3841956828d2af4b34e7e27b793df8cf46cc697f5f79c408db6a1e"}
Apr 22 15:34:12.207102 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.207076 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"1be18bed6a7a79973ac5216f81c62b42c9a79129b342b4f53ddc1b9f5fc4b3cc"}
Apr 22 15:34:12.207193 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.207110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"d02a5ba92014fdeaee3d647e4f30a639c787efdd12f68c7c4b83a2b417e93cc0"}
Apr 22 15:34:12.207193 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.207126 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"17747736e493b4f01537e748c32f1dfda48a942600c916b0e98f18f5e7ba892d"}
Apr 22 15:34:12.207193 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.207141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"864c22a87e03782891b9a02d75319b402f516c7d99de50a73e1d2d107622df3d"}
Apr 22 15:34:12.207193 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.207156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"48e8a5316aa75afa16ff27383a27085a90aff40debeef2d9099ae25e979c331e"}
Apr 22 15:34:12.207193 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.207169 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"839eb35dc2e941f4b45d35a9700e81cc092b15d9d8adbbcd551c969181ac039e"}
Apr 22 15:34:12.208439 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.208414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vqw4p" event={"ID":"45497d38-5523-4037-9d5e-b2d5cf55efc2","Type":"ContainerStarted","Data":"7463bb04eddc07d28e74902fcec1ad4278db4b9c4d707248f0b571d04e204ed7"}
Apr 22 15:34:12.212460 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.212441 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 15:34:12.230282 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.230247 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wkqg5" podStartSLOduration=4.005066513 podStartE2EDuration="21.230237185s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.708879492 +0000 UTC m=+3.178430274" lastFinishedPulling="2026-04-22 15:34:10.934050156 +0000 UTC m=+20.403600946" observedRunningTime="2026-04-22 15:34:12.217647389 +0000 UTC m=+21.687198189" watchObservedRunningTime="2026-04-22 15:34:12.230237185 +0000 UTC m=+21.699788017"
Apr 22 15:34:12.246291 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:12.246260 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vqw4p" podStartSLOduration=4.016425528 podStartE2EDuration="21.246252654s" podCreationTimestamp="2026-04-22 15:33:51
+0000 UTC" firstStartedPulling="2026-04-22 15:33:53.705280779 +0000 UTC m=+3.174831555" lastFinishedPulling="2026-04-22 15:34:10.935107904 +0000 UTC m=+20.404658681" observedRunningTime="2026-04-22 15:34:12.229988897 +0000 UTC m=+21.699539696" watchObservedRunningTime="2026-04-22 15:34:12.246252654 +0000 UTC m=+21.715803452" Apr 22 15:34:13.030301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.030057 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:34:12.212456235Z","UUID":"92b35a14-eb5e-4b80-8a9f-4589b7c12a7e","Handler":null,"Name":"","Endpoint":""} Apr 22 15:34:13.031848 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.031824 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:34:13.032000 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.031855 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:34:13.082093 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.082066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:13.082249 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:13.082224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:34:13.082317 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.082288 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:13.082435 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:13.082412 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:34:13.212256 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.211824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wd2xg" event={"ID":"01fd759b-aafb-49c6-a60e-5424150b1157","Type":"ContainerStarted","Data":"5d5a16a0b724feed0d43aeae061b251efc1587be2a89cc976b0180a9d1ea0a43"} Apr 22 15:34:13.213763 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.213722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" event={"ID":"f7ae90cc-3bb3-4d83-8b18-0478053cfa90","Type":"ContainerStarted","Data":"9015720b7fbcfa2661a61308f98727300ac2f4e4806ac14b42fb5c5ae5ec0551"} Apr 22 15:34:13.226966 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.226878 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wd2xg" podStartSLOduration=4.995486519 podStartE2EDuration="22.226862632s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.703342007 +0000 UTC m=+3.172892784" lastFinishedPulling="2026-04-22 15:34:10.934718106 +0000 UTC m=+20.404268897" 
observedRunningTime="2026-04-22 15:34:13.226456397 +0000 UTC m=+22.696007197" watchObservedRunningTime="2026-04-22 15:34:13.226862632 +0000 UTC m=+22.696413431" Apr 22 15:34:13.369057 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.368989 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:34:13.369641 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:13.369617 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:34:14.218854 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:14.218452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"3a8e3ba5c37650794ebefc87afc5f30ebbf4306ebd0a0361dd9b9fc5f51958bd"} Apr 22 15:34:14.220582 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:14.220548 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" event={"ID":"f7ae90cc-3bb3-4d83-8b18-0478053cfa90","Type":"ContainerStarted","Data":"c3e7dd5b47ac16fe0b6adfe40ca9d61ed5945905d137422e26c84e2ccb708e1d"} Apr 22 15:34:15.081399 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:15.081365 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:15.081598 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:15.081486 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:34:15.081598 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:15.081504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:15.081722 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:15.081613 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:34:17.081038 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.080838 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:17.081926 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.080844 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:17.081926 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:17.081113 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:34:17.081926 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:17.081181 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:34:17.226506 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.226477 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b6e7ecf-d54b-4238-8dd4-a8502eb2627e" containerID="77a1676717e1792a26aa7073d03e07ed09ab51cf4e4d34c9ea9ddd25d27fee91" exitCode=0 Apr 22 15:34:17.226624 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.226557 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerDied","Data":"77a1676717e1792a26aa7073d03e07ed09ab51cf4e4d34c9ea9ddd25d27fee91"} Apr 22 15:34:17.229834 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.229812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" event={"ID":"bf7bc51e-13ec-42f9-912c-b51cd7134006","Type":"ContainerStarted","Data":"acfd2611ebca81162d2d6b5d9564633954cfd040b92099e383af6cb2cc4bbb4e"} Apr 22 15:34:17.230118 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.230101 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:34:17.230118 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.230124 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:34:17.230118 ip-10-0-143-30 
kubenswrapper[2577]: I0422 15:34:17.230137 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:34:17.244523 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.244507 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:34:17.244864 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.244851 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" Apr 22 15:34:17.252406 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.252370 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x7pkz" podStartSLOduration=6.491489054 podStartE2EDuration="26.25236115s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.69934424 +0000 UTC m=+3.168895020" lastFinishedPulling="2026-04-22 15:34:13.460216325 +0000 UTC m=+22.929767116" observedRunningTime="2026-04-22 15:34:14.243771414 +0000 UTC m=+23.713322212" watchObservedRunningTime="2026-04-22 15:34:17.25236115 +0000 UTC m=+26.721911948" Apr 22 15:34:17.288486 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:17.288451 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw" podStartSLOduration=9.008208235 podStartE2EDuration="26.28844137s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.706608422 +0000 UTC m=+3.176159199" lastFinishedPulling="2026-04-22 15:34:10.986841543 +0000 UTC m=+20.456392334" observedRunningTime="2026-04-22 15:34:17.288428743 +0000 UTC m=+26.757979563" watchObservedRunningTime="2026-04-22 15:34:17.28844137 +0000 UTC m=+26.757992464" Apr 22 15:34:18.233681 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:18.233641 2577 generic.go:358] "Generic (PLEG): 
container finished" podID="2b6e7ecf-d54b-4238-8dd4-a8502eb2627e" containerID="3a76e2f9ed27966246eee2ce1f9c749e6c9dccd3f47bf54787499f3195b95ef0" exitCode=0 Apr 22 15:34:18.234150 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:18.233704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerDied","Data":"3a76e2f9ed27966246eee2ce1f9c749e6c9dccd3f47bf54787499f3195b95ef0"} Apr 22 15:34:18.321956 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:18.321922 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zzcmv"] Apr 22 15:34:18.322141 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:18.322084 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:18.322224 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:18.322200 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:34:18.322643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:18.322619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zhpd9"] Apr 22 15:34:18.322727 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:18.322705 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:18.322816 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:18.322795 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:34:19.238016 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:19.237684 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b6e7ecf-d54b-4238-8dd4-a8502eb2627e" containerID="eb8f49f8a01f916b7c08950eb3e1f763d26f7697bce45dc7e574b7f3ba729024" exitCode=0 Apr 22 15:34:19.238016 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:19.237763 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerDied","Data":"eb8f49f8a01f916b7c08950eb3e1f763d26f7697bce45dc7e574b7f3ba729024"} Apr 22 15:34:20.080370 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:20.080340 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:20.080370 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:20.080362 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:20.080559 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:20.080462 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:34:20.080657 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:20.080624 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:34:21.453959 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:21.453921 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:34:21.454559 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:21.454079 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 15:34:21.454627 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:21.454558 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ljlbr" Apr 22 15:34:22.080530 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.080498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:22.080715 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.080504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:22.080715 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:22.080618 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zhpd9" podUID="f537b0eb-8087-4572-b237-83ff59e51f13" Apr 22 15:34:22.080798 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:22.080722 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:34:22.675421 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.675392 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gkzsd"] Apr 22 15:34:22.710222 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.710194 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gkzsd"] Apr 22 15:34:22.710350 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.710293 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.710388 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:22.710347 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gkzsd" podUID="b89c0136-9aaf-4dfa-9c2c-f576dcf334b7" Apr 22 15:34:22.813638 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.813602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-kubelet-config\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.813771 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.813675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-dbus\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.813771 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.813705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.914656 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.914624 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-kubelet-config\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.914798 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.914696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-dbus\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.914798 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.914739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.914798 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.914753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-kubelet-config\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:22.914969 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:22.914872 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:22.914969 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:22.914960 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret podName:b89c0136-9aaf-4dfa-9c2c-f576dcf334b7 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.414938352 +0000 UTC m=+32.884489140 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret") pod "global-pull-secret-syncer-gkzsd" (UID: "b89c0136-9aaf-4dfa-9c2c-f576dcf334b7") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:22.914969 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:22.914957 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-dbus\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:23.418228 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.418190 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:23.418391 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:23.418296 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:23.418391 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:23.418355 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret podName:b89c0136-9aaf-4dfa-9c2c-f576dcf334b7 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.418343034 +0000 UTC m=+33.887893811 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret") pod "global-pull-secret-syncer-gkzsd" (UID: "b89c0136-9aaf-4dfa-9c2c-f576dcf334b7") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:34:23.831304 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.831277 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-30.ec2.internal" event="NodeReady" Apr 22 15:34:23.831765 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.831403 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 15:34:23.883891 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.883858 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-swhzp"] Apr 22 15:34:23.885827 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.885810 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:23.887558 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.887536 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8njnj"] Apr 22 15:34:23.888550 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.888529 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 15:34:23.888639 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.888558 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgzsv\"" Apr 22 15:34:23.888639 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.888538 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 15:34:23.889226 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.889211 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:23.891359 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.891339 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 15:34:23.891458 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.891393 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fpsm4\"" Apr 22 15:34:23.891740 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.891721 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 15:34:23.891826 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.891766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 15:34:23.906499 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.906481 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8njnj"] Apr 22 15:34:23.909841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:23.909823 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-swhzp"] Apr 22 15:34:24.022838 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.022810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74e2d982-4c5d-4d80-8ce4-b86491c0f765-config-volume\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.023029 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.022851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8c95\" (UniqueName: 
\"kubernetes.io/projected/74e2d982-4c5d-4d80-8ce4-b86491c0f765-kube-api-access-f8c95\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.023029 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.022883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:24.023029 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.022980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74e2d982-4c5d-4d80-8ce4-b86491c0f765-tmp-dir\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.023172 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.023091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.023172 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.023148 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twrcs\" (UniqueName: \"kubernetes.io/projected/a1de4878-d335-41a9-a163-5332f8e575d6-kube-api-access-twrcs\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:24.080608 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.080580 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:24.080608 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.080598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:24.080803 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.080598 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:24.083234 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.083170 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 15:34:24.083234 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.083198 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 15:34:24.083234 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.083207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:34:24.083445 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.083273 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxbwk\"" Apr 22 15:34:24.083486 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.083460 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xnsfv\"" Apr 22 15:34:24.083550 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.083531 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 15:34:24.123454 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/74e2d982-4c5d-4d80-8ce4-b86491c0f765-config-volume\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.123548 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8c95\" (UniqueName: \"kubernetes.io/projected/74e2d982-4c5d-4d80-8ce4-b86491c0f765-kube-api-access-f8c95\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.123548 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:24.123548 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74e2d982-4c5d-4d80-8ce4-b86491c0f765-tmp-dir\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.123646 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.123604 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:24.123677 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.123652 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.62363813 +0000 UTC m=+34.093188910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found Apr 22 15:34:24.123732 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.123782 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twrcs\" (UniqueName: \"kubernetes.io/projected/a1de4878-d335-41a9-a163-5332f8e575d6-kube-api-access-twrcs\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:24.123886 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.123868 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:24.123997 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.123869 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/74e2d982-4c5d-4d80-8ce4-b86491c0f765-tmp-dir\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.124042 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.123990 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:24.623975378 +0000 UTC m=+34.093526162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found Apr 22 15:34:24.124086 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.124062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74e2d982-4c5d-4d80-8ce4-b86491c0f765-config-volume\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.136607 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.136588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8c95\" (UniqueName: \"kubernetes.io/projected/74e2d982-4c5d-4d80-8ce4-b86491c0f765-kube-api-access-f8c95\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.136701 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.136683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twrcs\" (UniqueName: \"kubernetes.io/projected/a1de4878-d335-41a9-a163-5332f8e575d6-kube-api-access-twrcs\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:24.426673 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.426590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 
15:34:24.429274 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.429251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b89c0136-9aaf-4dfa-9c2c-f576dcf334b7-original-pull-secret\") pod \"global-pull-secret-syncer-gkzsd\" (UID: \"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7\") " pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:24.628815 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.628776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:24.629020 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.628839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:24.629020 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.628962 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:24.629020 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.628976 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:24.629179 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.629038 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:25.629017685 +0000 UTC m=+35.098568478 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found Apr 22 15:34:24.629179 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.629057 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:25.629048037 +0000 UTC m=+35.098598815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found Apr 22 15:34:24.690848 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.690768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkzsd" Apr 22 15:34:24.730467 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.730052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:34:24.730467 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.730205 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:34:24.730467 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:24.730267 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:56.730248475 +0000 UTC m=+66.199799276 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : secret "metrics-daemon-secret" not found Apr 22 15:34:24.831623 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.831429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:24.835117 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.835092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxlv6\" (UniqueName: \"kubernetes.io/projected/f537b0eb-8087-4572-b237-83ff59e51f13-kube-api-access-pxlv6\") pod \"network-check-target-zhpd9\" (UID: \"f537b0eb-8087-4572-b237-83ff59e51f13\") " pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:24.845299 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.845241 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gkzsd"] Apr 22 15:34:24.849308 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:34:24.849282 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb89c0136_9aaf_4dfa_9c2c_f576dcf334b7.slice/crio-383f31735917c388f05ecc4c217d20661bd01a6bcdb22bad222a74f05d608711 WatchSource:0}: Error finding container 383f31735917c388f05ecc4c217d20661bd01a6bcdb22bad222a74f05d608711: Status 404 returned error can't find the container with id 383f31735917c388f05ecc4c217d20661bd01a6bcdb22bad222a74f05d608711 Apr 22 
15:34:24.999787 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:24.999758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:25.118369 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:25.118336 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zhpd9"] Apr 22 15:34:25.123204 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:34:25.123179 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf537b0eb_8087_4572_b237_83ff59e51f13.slice/crio-2f8c9f7c77acb4c2b3a3c2015f9fccec3efd0a9e46cee34fdd046f43dc04b93f WatchSource:0}: Error finding container 2f8c9f7c77acb4c2b3a3c2015f9fccec3efd0a9e46cee34fdd046f43dc04b93f: Status 404 returned error can't find the container with id 2f8c9f7c77acb4c2b3a3c2015f9fccec3efd0a9e46cee34fdd046f43dc04b93f Apr 22 15:34:25.252021 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:25.251937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zhpd9" event={"ID":"f537b0eb-8087-4572-b237-83ff59e51f13","Type":"ContainerStarted","Data":"2f8c9f7c77acb4c2b3a3c2015f9fccec3efd0a9e46cee34fdd046f43dc04b93f"} Apr 22 15:34:25.252873 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:25.252840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gkzsd" event={"ID":"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7","Type":"ContainerStarted","Data":"383f31735917c388f05ecc4c217d20661bd01a6bcdb22bad222a74f05d608711"} Apr 22 15:34:25.637806 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:25.637577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " 
pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:25.637806 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:25.637635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:25.637806 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:25.637759 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:25.637806 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:25.637762 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:25.638102 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:25.637825 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:27.637804577 +0000 UTC m=+37.107355374 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found Apr 22 15:34:25.638102 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:25.637845 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:27.637835311 +0000 UTC m=+37.107386095 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found Apr 22 15:34:27.652499 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:27.652451 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:27.652990 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:27.652558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:27.652990 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:27.652681 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:27.652990 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:27.652729 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:31.652715641 +0000 UTC m=+41.122266418 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found Apr 22 15:34:27.653176 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:27.653111 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:27.653176 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:27.653157 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:31.65314228 +0000 UTC m=+41.122693059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found Apr 22 15:34:31.683815 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:31.683779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:31.684256 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:31.683857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:31.684256 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:31.683954 2577 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:31.684256 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:31.684009 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:31.684256 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:31.684035 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:39.684013422 +0000 UTC m=+49.153564200 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found Apr 22 15:34:31.684256 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:31.684066 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:39.684049838 +0000 UTC m=+49.153600632 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found Apr 22 15:34:32.267642 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.267611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zhpd9" event={"ID":"f537b0eb-8087-4572-b237-83ff59e51f13","Type":"ContainerStarted","Data":"e192361adb234b65b80be9aefb193fbadab285d02d5f92d3b9355b94c6eaca99"} Apr 22 15:34:32.267825 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.267726 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zhpd9" Apr 22 15:34:32.269832 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.269809 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b6e7ecf-d54b-4238-8dd4-a8502eb2627e" containerID="7340f238f9e8eefdce33cdbc2ec2cd53bf2553e020b987d0bdff9b4136288cd6" exitCode=0 Apr 22 15:34:32.269944 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.269864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerDied","Data":"7340f238f9e8eefdce33cdbc2ec2cd53bf2553e020b987d0bdff9b4136288cd6"} Apr 22 15:34:32.271249 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.271228 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gkzsd" event={"ID":"b89c0136-9aaf-4dfa-9c2c-f576dcf334b7","Type":"ContainerStarted","Data":"8bb0f8dba959700a9c0709bbde1d46a26dc0448f21b5134a8e7b31f665afe84e"} Apr 22 15:34:32.281965 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.281893 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-zhpd9" podStartSLOduration=35.011302729 podStartE2EDuration="41.281879047s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:34:25.125131134 +0000 UTC m=+34.594681911" lastFinishedPulling="2026-04-22 15:34:31.395707452 +0000 UTC m=+40.865258229" observedRunningTime="2026-04-22 15:34:32.281778474 +0000 UTC m=+41.751329272" watchObservedRunningTime="2026-04-22 15:34:32.281879047 +0000 UTC m=+41.751429859" Apr 22 15:34:32.300198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:32.300165 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gkzsd" podStartSLOduration=3.745904059 podStartE2EDuration="10.30015457s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:24.851238242 +0000 UTC m=+34.320789034" lastFinishedPulling="2026-04-22 15:34:31.405488754 +0000 UTC m=+40.875039545" observedRunningTime="2026-04-22 15:34:32.299598002 +0000 UTC m=+41.769148800" watchObservedRunningTime="2026-04-22 15:34:32.30015457 +0000 UTC m=+41.769705369" Apr 22 15:34:33.275406 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:33.275373 2577 generic.go:358] "Generic (PLEG): container finished" podID="2b6e7ecf-d54b-4238-8dd4-a8502eb2627e" containerID="7718a2fa0d6f78f87612263884da8d81a239ccd5c2c28e8adb8b314df9c7a4e6" exitCode=0 Apr 22 15:34:33.275785 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:33.275487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerDied","Data":"7718a2fa0d6f78f87612263884da8d81a239ccd5c2c28e8adb8b314df9c7a4e6"} Apr 22 15:34:34.280113 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:34.280077 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" 
event={"ID":"2b6e7ecf-d54b-4238-8dd4-a8502eb2627e","Type":"ContainerStarted","Data":"0847d4f1ed8175aaf3c4dbeddfb13c9903d1f612949db577b5d719354cf108ca"} Apr 22 15:34:34.303428 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:34.303382 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-x6qrc" podStartSLOduration=5.610570444 podStartE2EDuration="43.303367594s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:33:53.702844503 +0000 UTC m=+3.172395280" lastFinishedPulling="2026-04-22 15:34:31.395641653 +0000 UTC m=+40.865192430" observedRunningTime="2026-04-22 15:34:34.302629253 +0000 UTC m=+43.772180051" watchObservedRunningTime="2026-04-22 15:34:34.303367594 +0000 UTC m=+43.772918392" Apr 22 15:34:39.737273 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:39.737234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:34:39.737630 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:39.737298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:34:39.737630 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:39.737381 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:34:39.737630 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:39.737438 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert 
podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:55.73742402 +0000 UTC m=+65.206974797 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found Apr 22 15:34:39.737630 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:39.737380 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:34:39.737630 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:39.737469 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:55.737462938 +0000 UTC m=+65.207013715 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found
Apr 22 15:34:49.278369 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:49.278332 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4qpdw"
Apr 22 15:34:55.744758 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:55.744694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj"
Apr 22 15:34:55.744758 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:55.744767 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp"
Apr 22 15:34:55.745285 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:55.744858 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:34:55.745285 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:55.744883 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:34:55.745285 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:55.744948 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:27.744931488 +0000 UTC m=+97.214482268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found
Apr 22 15:34:55.745285 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:55.744965 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:27.744958496 +0000 UTC m=+97.214509277 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found
Apr 22 15:34:56.751250 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:34:56.751212 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:34:56.751649 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:56.751371 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:34:56.751649 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:34:56.751441 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:36:00.751423665 +0000 UTC m=+130.220974442 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : secret "metrics-daemon-secret" not found
Apr 22 15:35:03.278645 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:03.278610 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zhpd9"
Apr 22 15:35:27.763884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:27.763842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj"
Apr 22 15:35:27.764383 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:27.763915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp"
Apr 22 15:35:27.764383 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:35:27.764012 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:35:27.764383 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:35:27.764088 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert podName:a1de4878-d335-41a9-a163-5332f8e575d6 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:31.764074487 +0000 UTC m=+161.233625264 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert") pod "ingress-canary-8njnj" (UID: "a1de4878-d335-41a9-a163-5332f8e575d6") : secret "canary-serving-cert" not found
Apr 22 15:35:27.764383 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:35:27.764014 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:35:27.764383 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:35:27.764158 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls podName:74e2d982-4c5d-4d80-8ce4-b86491c0f765 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:31.764147444 +0000 UTC m=+161.233698221 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls") pod "dns-default-swhzp" (UID: "74e2d982-4c5d-4d80-8ce4-b86491c0f765") : secret "dns-default-metrics-tls" not found
Apr 22 15:35:59.616090 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.616057 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"]
Apr 22 15:35:59.618046 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.618018 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"
Apr 22 15:35:59.621186 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.621161 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 15:35:59.621888 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.621873 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:35:59.621968 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.621892 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7nkts\""
Apr 22 15:35:59.626446 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.626430 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"]
Apr 22 15:35:59.668703 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.668676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv464\" (UniqueName: \"kubernetes.io/projected/015ea955-83cf-4c04-8d1e-238e49a24f54-kube-api-access-qv464\") pod \"volume-data-source-validator-7c6cbb6c87-pln49\" (UID: \"015ea955-83cf-4c04-8d1e-238e49a24f54\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"
Apr 22 15:35:59.769520 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.769497 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qv464\" (UniqueName: \"kubernetes.io/projected/015ea955-83cf-4c04-8d1e-238e49a24f54-kube-api-access-qv464\") pod \"volume-data-source-validator-7c6cbb6c87-pln49\" (UID: \"015ea955-83cf-4c04-8d1e-238e49a24f54\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"
Apr 22 15:35:59.780639 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.780608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv464\" (UniqueName: \"kubernetes.io/projected/015ea955-83cf-4c04-8d1e-238e49a24f54-kube-api-access-qv464\") pod \"volume-data-source-validator-7c6cbb6c87-pln49\" (UID: \"015ea955-83cf-4c04-8d1e-238e49a24f54\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"
Apr 22 15:35:59.927140 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:35:59.927095 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"
Apr 22 15:36:00.036841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:00.036816 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49"]
Apr 22 15:36:00.040150 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:00.040123 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015ea955_83cf_4c04_8d1e_238e49a24f54.slice/crio-1bad5ed516597cc0e603d774d2130c26f821205a3f4305897bba0481a274ac64 WatchSource:0}: Error finding container 1bad5ed516597cc0e603d774d2130c26f821205a3f4305897bba0481a274ac64: Status 404 returned error can't find the container with id 1bad5ed516597cc0e603d774d2130c26f821205a3f4305897bba0481a274ac64
Apr 22 15:36:00.438382 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:00.438355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49" event={"ID":"015ea955-83cf-4c04-8d1e-238e49a24f54","Type":"ContainerStarted","Data":"1bad5ed516597cc0e603d774d2130c26f821205a3f4305897bba0481a274ac64"}
Apr 22 15:36:00.777282 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:00.777256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:36:00.777561 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:00.777360 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:36:00.777561 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:00.777410 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs podName:f7958036-9067-47bb-91dc-dc565feb289a nodeName:}" failed. No retries permitted until 2026-04-22 15:38:02.777394481 +0000 UTC m=+252.246945258 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs") pod "network-metrics-daemon-zzcmv" (UID: "f7958036-9067-47bb-91dc-dc565feb289a") : secret "metrics-daemon-secret" not found
Apr 22 15:36:01.225864 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:01.225790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vqw4p_45497d38-5523-4037-9d5e-b2d5cf55efc2/dns-node-resolver/0.log"
Apr 22 15:36:02.226393 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:02.226312 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wkqg5_0acda737-f5c9-4897-bd5e-94296fc02284/node-ca/0.log"
Apr 22 15:36:02.444393 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:02.444257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49" event={"ID":"015ea955-83cf-4c04-8d1e-238e49a24f54","Type":"ContainerStarted","Data":"12b1d3b2f73629eeb9649d672c32e62776e1224980a32932ae1d3d61b3d78d62"}
Apr 22 15:36:02.458704 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:02.458664 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pln49" podStartSLOduration=1.62166817 podStartE2EDuration="3.458651673s" podCreationTimestamp="2026-04-22 15:35:59 +0000 UTC" firstStartedPulling="2026-04-22 15:36:00.041776658 +0000 UTC m=+129.511327434" lastFinishedPulling="2026-04-22 15:36:01.878760158 +0000 UTC m=+131.348310937" observedRunningTime="2026-04-22 15:36:02.457657182 +0000 UTC m=+131.927207992" watchObservedRunningTime="2026-04-22 15:36:02.458651673 +0000 UTC m=+131.928202489"
Apr 22 15:36:04.726911 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.726863 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"]
Apr 22 15:36:04.728781 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.728767 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.732427 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.732388 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 15:36:04.732575 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.732459 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:36:04.732575 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.732465 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 15:36:04.733389 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.733370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-n7xns\""
Apr 22 15:36:04.733488 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.733374 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 15:36:04.740728 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.740711 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"]
Apr 22 15:36:04.803946 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.803922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98pg\" (UniqueName: \"kubernetes.io/projected/31ebe0d7-a716-4a01-85dd-f78472a7c41b-kube-api-access-z98pg\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.804034 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.803965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31ebe0d7-a716-4a01-85dd-f78472a7c41b-config\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.804034 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.803988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31ebe0d7-a716-4a01-85dd-f78472a7c41b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.904275 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.904251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z98pg\" (UniqueName: \"kubernetes.io/projected/31ebe0d7-a716-4a01-85dd-f78472a7c41b-kube-api-access-z98pg\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.904332 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.904291 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31ebe0d7-a716-4a01-85dd-f78472a7c41b-config\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.904332 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.904316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31ebe0d7-a716-4a01-85dd-f78472a7c41b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.904776 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.904757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31ebe0d7-a716-4a01-85dd-f78472a7c41b-config\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.906592 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.906573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31ebe0d7-a716-4a01-85dd-f78472a7c41b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:04.914289 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:04.914266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98pg\" (UniqueName: \"kubernetes.io/projected/31ebe0d7-a716-4a01-85dd-f78472a7c41b-kube-api-access-z98pg\") pod \"service-ca-operator-d6fc45fc5-lhbsj\" (UID: \"31ebe0d7-a716-4a01-85dd-f78472a7c41b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:05.036907 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:05.036876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"
Apr 22 15:36:05.153535 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:05.153510 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj"]
Apr 22 15:36:05.156022 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:05.155997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ebe0d7_a716_4a01_85dd_f78472a7c41b.slice/crio-8bda2128acbde9ad29609c1135e6e17233a89e4205b0d31cffd175431115ce02 WatchSource:0}: Error finding container 8bda2128acbde9ad29609c1135e6e17233a89e4205b0d31cffd175431115ce02: Status 404 returned error can't find the container with id 8bda2128acbde9ad29609c1135e6e17233a89e4205b0d31cffd175431115ce02
Apr 22 15:36:05.454644 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:05.454583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj" event={"ID":"31ebe0d7-a716-4a01-85dd-f78472a7c41b","Type":"ContainerStarted","Data":"8bda2128acbde9ad29609c1135e6e17233a89e4205b0d31cffd175431115ce02"}
Apr 22 15:36:06.283603 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.283556 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-77848d7596-wp72d"]
Apr 22 15:36:06.286604 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.286580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.289341 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.289318 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 15:36:06.289483 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.289457 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 15:36:06.289560 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.289544 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-92frt\""
Apr 22 15:36:06.290600 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.290579 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 15:36:06.296254 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.296234 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 15:36:06.302046 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.302014 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77848d7596-wp72d"]
Apr 22 15:36:06.414482 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b4ls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-kube-api-access-9b4ls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414633 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-trusted-ca\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414633 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-image-registry-private-configuration\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414633 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54f56349-cf1e-4efa-b6f2-ff228d7570a3-ca-trust-extracted\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414784 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-bound-sa-token\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414784 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414784 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-certificates\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.414784 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.414728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-installation-pull-secrets\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515382 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9b4ls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-kube-api-access-9b4ls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515382 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-trusted-ca\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515559 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-image-registry-private-configuration\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515559 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54f56349-cf1e-4efa-b6f2-ff228d7570a3-ca-trust-extracted\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515559 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-bound-sa-token\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515559 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515559 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-certificates\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515758 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.515571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-installation-pull-secrets\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.515758 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:06.515647 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:36:06.515758 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:06.515669 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77848d7596-wp72d: secret "image-registry-tls" not found
Apr 22 15:36:06.515758 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:06.515740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls podName:54f56349-cf1e-4efa-b6f2-ff228d7570a3 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:07.015722989 +0000 UTC m=+136.485273786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls") pod "image-registry-77848d7596-wp72d" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3") : secret "image-registry-tls" not found
Apr 22 15:36:06.516173 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.516153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54f56349-cf1e-4efa-b6f2-ff228d7570a3-ca-trust-extracted\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.516292 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.516271 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-certificates\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.516466 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.516450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-trusted-ca\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.518024 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.518002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-image-registry-private-configuration\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.518137 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.518121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-installation-pull-secrets\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.525578 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.525557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-bound-sa-token\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:06.525843 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:06.525824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b4ls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-kube-api-access-9b4ls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:07.018820 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:07.018777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:07.018996 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:07.018936 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:36:07.018996 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:07.018952 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77848d7596-wp72d: secret "image-registry-tls" not found
Apr 22 15:36:07.019083 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:07.019008 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls podName:54f56349-cf1e-4efa-b6f2-ff228d7570a3 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:08.018993678 +0000 UTC m=+137.488544471 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls") pod "image-registry-77848d7596-wp72d" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3") : secret "image-registry-tls" not found
Apr 22 15:36:08.026839 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:08.026793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:08.027226 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:08.026960 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:36:08.027226 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:08.026981 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77848d7596-wp72d: secret "image-registry-tls" not found
Apr 22 15:36:08.027226 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:08.027041 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls podName:54f56349-cf1e-4efa-b6f2-ff228d7570a3 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:10.027023539 +0000 UTC m=+139.496574324 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls") pod "image-registry-77848d7596-wp72d" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3") : secret "image-registry-tls" not found
Apr 22 15:36:08.461717 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:08.461636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj" event={"ID":"31ebe0d7-a716-4a01-85dd-f78472a7c41b","Type":"ContainerStarted","Data":"428b3ae6a295f7f06a8e0f9639cf4dea5186bd6597bce28f1befee889be6ac74"}
Apr 22 15:36:08.478741 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:08.478669 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj" podStartSLOduration=2.178816857 podStartE2EDuration="4.478654222s" podCreationTimestamp="2026-04-22 15:36:04 +0000 UTC" firstStartedPulling="2026-04-22 15:36:05.157601393 +0000 UTC m=+134.627152173" lastFinishedPulling="2026-04-22 15:36:07.457438761 +0000 UTC m=+136.926989538" observedRunningTime="2026-04-22 15:36:08.478224788 +0000 UTC m=+137.947775586" watchObservedRunningTime="2026-04-22 15:36:08.478654222 +0000 UTC m=+137.948205022"
Apr 22 15:36:10.042302 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:10.042272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:10.042678 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:10.042430 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:36:10.042678 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:10.042449 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77848d7596-wp72d: secret "image-registry-tls" not found
Apr 22 15:36:10.042678 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:10.042509 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls podName:54f56349-cf1e-4efa-b6f2-ff228d7570a3 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:14.042490865 +0000 UTC m=+143.512041643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls") pod "image-registry-77848d7596-wp72d" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3") : secret "image-registry-tls" not found
Apr 22 15:36:14.071277 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:14.071231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d"
Apr 22 15:36:14.071657 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:14.071343 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:36:14.071657 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:14.071356 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-77848d7596-wp72d: secret 
"image-registry-tls" not found Apr 22 15:36:14.071657 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:14.071415 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls podName:54f56349-cf1e-4efa-b6f2-ff228d7570a3 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:22.071400719 +0000 UTC m=+151.540951497 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls") pod "image-registry-77848d7596-wp72d" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3") : secret "image-registry-tls" not found Apr 22 15:36:22.129737 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.129698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d" Apr 22 15:36:22.132141 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.132109 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"image-registry-77848d7596-wp72d\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " pod="openshift-image-registry/image-registry-77848d7596-wp72d" Apr 22 15:36:22.197135 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.197107 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-77848d7596-wp72d" Apr 22 15:36:22.322877 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.322846 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-77848d7596-wp72d"] Apr 22 15:36:22.328714 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:22.328686 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f56349_cf1e_4efa_b6f2_ff228d7570a3.slice/crio-35205faf185bd5ba5ac45dc391bbc18a7179d479e2d53ea3003a628190e20218 WatchSource:0}: Error finding container 35205faf185bd5ba5ac45dc391bbc18a7179d479e2d53ea3003a628190e20218: Status 404 returned error can't find the container with id 35205faf185bd5ba5ac45dc391bbc18a7179d479e2d53ea3003a628190e20218 Apr 22 15:36:22.489607 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.489573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77848d7596-wp72d" event={"ID":"54f56349-cf1e-4efa-b6f2-ff228d7570a3","Type":"ContainerStarted","Data":"4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404"} Apr 22 15:36:22.489607 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.489612 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77848d7596-wp72d" event={"ID":"54f56349-cf1e-4efa-b6f2-ff228d7570a3","Type":"ContainerStarted","Data":"35205faf185bd5ba5ac45dc391bbc18a7179d479e2d53ea3003a628190e20218"} Apr 22 15:36:22.489833 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.489703 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-77848d7596-wp72d" Apr 22 15:36:22.531195 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:22.531151 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-77848d7596-wp72d" 
podStartSLOduration=16.531136701 podStartE2EDuration="16.531136701s" podCreationTimestamp="2026-04-22 15:36:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:36:22.529518166 +0000 UTC m=+151.999068964" watchObservedRunningTime="2026-04-22 15:36:22.531136701 +0000 UTC m=+152.000687500" Apr 22 15:36:26.896465 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:26.896418 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-swhzp" podUID="74e2d982-4c5d-4d80-8ce4-b86491c0f765" Apr 22 15:36:26.900587 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:26.900561 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8njnj" podUID="a1de4878-d335-41a9-a163-5332f8e575d6" Apr 22 15:36:27.106526 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:27.106491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zzcmv" podUID="f7958036-9067-47bb-91dc-dc565feb289a" Apr 22 15:36:27.500359 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:27.500333 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-swhzp" Apr 22 15:36:27.500359 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:27.500368 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:36:31.800955 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:31.800918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:36:31.801315 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:31.800968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:36:31.803359 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:31.803339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1de4878-d335-41a9-a163-5332f8e575d6-cert\") pod \"ingress-canary-8njnj\" (UID: \"a1de4878-d335-41a9-a163-5332f8e575d6\") " pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:36:31.803421 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:31.803375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e2d982-4c5d-4d80-8ce4-b86491c0f765-metrics-tls\") pod \"dns-default-swhzp\" (UID: \"74e2d982-4c5d-4d80-8ce4-b86491c0f765\") " pod="openshift-dns/dns-default-swhzp" Apr 22 15:36:32.007941 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.007912 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgzsv\"" Apr 22 15:36:32.007941 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.007919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fpsm4\"" Apr 22 15:36:32.012032 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.012011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-swhzp" Apr 22 15:36:32.012032 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.012024 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8njnj" Apr 22 15:36:32.048351 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.048323 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-542rl"] Apr 22 15:36:32.052400 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.052320 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" Apr 22 15:36:32.056475 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.056446 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 15:36:32.056819 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.056764 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8fdsc\"" Apr 22 15:36:32.056819 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.056780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 15:36:32.072725 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.072678 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-542rl"] Apr 22 15:36:32.124623 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.124590 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77848d7596-wp72d"] Apr 
22 15:36:32.132731 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.132698 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-s24ht"] Apr 22 15:36:32.134797 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.134766 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s24ht" Apr 22 15:36:32.140948 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.140925 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 15:36:32.141073 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.140971 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 15:36:32.141936 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.141211 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-87d75"] Apr 22 15:36:32.141936 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.141307 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-c85gk\"" Apr 22 15:36:32.143867 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.143844 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"] Apr 22 15:36:32.144070 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.144051 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87d75" Apr 22 15:36:32.146356 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.146336 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5" Apr 22 15:36:32.147834 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.147815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 15:36:32.147941 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.147847 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 15:36:32.147996 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.147972 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6mcjg\"" Apr 22 15:36:32.148111 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.148096 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 15:36:32.149783 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.149754 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 15:36:32.149927 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.149890 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-v6zmr\"" Apr 22 15:36:32.151012 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.150854 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 15:36:32.155121 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.155100 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s24ht"] Apr 22 15:36:32.163127 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.163108 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"] Apr 22 15:36:32.168241 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.168223 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87d75"] Apr 22 15:36:32.179047 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.179024 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8njnj"] Apr 22 15:36:32.181946 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.181920 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1de4878_d335_41a9_a163_5332f8e575d6.slice/crio-3257757cad73e6f498fb12ba419395c6f516889e04665aace9260f7f6b9f5d72 WatchSource:0}: Error finding container 3257757cad73e6f498fb12ba419395c6f516889e04665aace9260f7f6b9f5d72: Status 404 returned error can't find the container with id 3257757cad73e6f498fb12ba419395c6f516889e04665aace9260f7f6b9f5d72 Apr 22 15:36:32.191505 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.191483 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e2d982_4c5d_4d80_8ce4_b86491c0f765.slice/crio-886d880b8609799ae172c532d3917e02002da319b142723007fc5ac286976c70 WatchSource:0}: Error finding container 886d880b8609799ae172c532d3917e02002da319b142723007fc5ac286976c70: Status 404 returned error can't find the container with id 886d880b8609799ae172c532d3917e02002da319b142723007fc5ac286976c70 Apr 22 15:36:32.191801 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.191787 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-swhzp"] Apr 22 15:36:32.192612 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.192592 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f98788744-gfqth"] Apr 22 15:36:32.194880 ip-10-0-143-30 
kubenswrapper[2577]: I0422 15:36:32.194863 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.203407 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.203376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f1bbe654-6603-4266-aaa8-3f2f9852be83-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-542rl\" (UID: \"f1bbe654-6603-4266-aaa8-3f2f9852be83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" Apr 22 15:36:32.203498 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.203434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f1bbe654-6603-4266-aaa8-3f2f9852be83-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-542rl\" (UID: \"f1bbe654-6603-4266-aaa8-3f2f9852be83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" Apr 22 15:36:32.208173 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.208155 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f98788744-gfqth"] Apr 22 15:36:32.303938 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.303843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t89fp\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-kube-api-access-t89fp\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.303938 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.303879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/c7193304-0a5d-46fd-89e7-9c0b2192fa40-crio-socket\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75" Apr 22 15:36:32.303938 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.303931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f1bbe654-6603-4266-aaa8-3f2f9852be83-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-542rl\" (UID: \"f1bbe654-6603-4266-aaa8-3f2f9852be83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" Apr 22 15:36:32.304198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7193304-0a5d-46fd-89e7-9c0b2192fa40-data-volume\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75" Apr 22 15:36:32.304198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c909177c-1f2c-45a6-b86e-d1ebca982be2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qltc5\" (UID: \"c909177c-1f2c-45a6-b86e-d1ebca982be2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5" Apr 22 15:36:32.304198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7193304-0a5d-46fd-89e7-9c0b2192fa40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") 
" pod="openshift-insights/insights-runtime-extractor-87d75" Apr 22 15:36:32.304198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ebe0392-9655-4115-b039-44138ef50068-image-registry-private-configuration\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.304198 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f1bbe654-6603-4266-aaa8-3f2f9852be83-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-542rl\" (UID: \"f1bbe654-6603-4266-aaa8-3f2f9852be83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" Apr 22 15:36:32.304469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ebe0392-9655-4115-b039-44138ef50068-registry-certificates\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.304469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-bound-sa-token\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.304469 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:36:32.304355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg9wj\" (UniqueName: \"kubernetes.io/projected/c7193304-0a5d-46fd-89e7-9c0b2192fa40-kube-api-access-bg9wj\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75" Apr 22 15:36:32.304469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-registry-tls\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.304469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ebe0392-9655-4115-b039-44138ef50068-installation-pull-secrets\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.304469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304448 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ctg\" (UniqueName: \"kubernetes.io/projected/d15a5003-51ba-4bc8-a06b-2215afd58ed5-kube-api-access-d7ctg\") pod \"downloads-6bcc868b7-s24ht\" (UID: \"d15a5003-51ba-4bc8-a06b-2215afd58ed5\") " pod="openshift-console/downloads-6bcc868b7-s24ht" Apr 22 15:36:32.304744 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304471 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/c7193304-0a5d-46fd-89e7-9c0b2192fa40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75" Apr 22 15:36:32.304744 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ebe0392-9655-4115-b039-44138ef50068-ca-trust-extracted\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.304744 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f1bbe654-6603-4266-aaa8-3f2f9852be83-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-542rl\" (UID: \"f1bbe654-6603-4266-aaa8-3f2f9852be83\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" Apr 22 15:36:32.304744 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.304524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ebe0392-9655-4115-b039-44138ef50068-trusted-ca\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:32.306713 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.306691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f1bbe654-6603-4266-aaa8-3f2f9852be83-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-542rl\" (UID: \"f1bbe654-6603-4266-aaa8-3f2f9852be83\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl"
Apr 22 15:36:32.362712 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.362685 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.405934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7193304-0a5d-46fd-89e7-9c0b2192fa40-data-volume\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.405989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c909177c-1f2c-45a6-b86e-d1ebca982be2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qltc5\" (UID: \"c909177c-1f2c-45a6-b86e-d1ebca982be2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406013 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7193304-0a5d-46fd-89e7-9c0b2192fa40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ebe0392-9655-4115-b039-44138ef50068-image-registry-private-configuration\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ebe0392-9655-4115-b039-44138ef50068-registry-certificates\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-bound-sa-token\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bg9wj\" (UniqueName: \"kubernetes.io/projected/c7193304-0a5d-46fd-89e7-9c0b2192fa40-kube-api-access-bg9wj\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-registry-tls\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ebe0392-9655-4115-b039-44138ef50068-installation-pull-secrets\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406155 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7ctg\" (UniqueName: \"kubernetes.io/projected/d15a5003-51ba-4bc8-a06b-2215afd58ed5-kube-api-access-d7ctg\") pod \"downloads-6bcc868b7-s24ht\" (UID: \"d15a5003-51ba-4bc8-a06b-2215afd58ed5\") " pod="openshift-console/downloads-6bcc868b7-s24ht"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7193304-0a5d-46fd-89e7-9c0b2192fa40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ebe0392-9655-4115-b039-44138ef50068-ca-trust-extracted\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ebe0392-9655-4115-b039-44138ef50068-trusted-ca\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406241 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t89fp\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-kube-api-access-t89fp\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7193304-0a5d-46fd-89e7-9c0b2192fa40-crio-socket\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.407884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406392 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7193304-0a5d-46fd-89e7-9c0b2192fa40-crio-socket\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.408782 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7193304-0a5d-46fd-89e7-9c0b2192fa40-data-volume\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.408782 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.406946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3ebe0392-9655-4115-b039-44138ef50068-ca-trust-extracted\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.408782 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.407702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3ebe0392-9655-4115-b039-44138ef50068-registry-certificates\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.409353 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.409288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7193304-0a5d-46fd-89e7-9c0b2192fa40-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.409558 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.409384 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ebe0392-9655-4115-b039-44138ef50068-trusted-ca\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.410845 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.410809 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3ebe0392-9655-4115-b039-44138ef50068-image-registry-private-configuration\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.411167 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.411143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3ebe0392-9655-4115-b039-44138ef50068-installation-pull-secrets\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.411312 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.411288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c909177c-1f2c-45a6-b86e-d1ebca982be2-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qltc5\" (UID: \"c909177c-1f2c-45a6-b86e-d1ebca982be2\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"
Apr 22 15:36:32.411915 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.411871 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-registry-tls\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.412134 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.412079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7193304-0a5d-46fd-89e7-9c0b2192fa40-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.421465 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.421440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t89fp\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-kube-api-access-t89fp\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.421541 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.421512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ebe0392-9655-4115-b039-44138ef50068-bound-sa-token\") pod \"image-registry-f98788744-gfqth\" (UID: \"3ebe0392-9655-4115-b039-44138ef50068\") " pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.423309 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.423287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg9wj\" (UniqueName: \"kubernetes.io/projected/c7193304-0a5d-46fd-89e7-9c0b2192fa40-kube-api-access-bg9wj\") pod \"insights-runtime-extractor-87d75\" (UID: \"c7193304-0a5d-46fd-89e7-9c0b2192fa40\") " pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.424244 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.424227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7ctg\" (UniqueName: \"kubernetes.io/projected/d15a5003-51ba-4bc8-a06b-2215afd58ed5-kube-api-access-d7ctg\") pod \"downloads-6bcc868b7-s24ht\" (UID: \"d15a5003-51ba-4bc8-a06b-2215afd58ed5\") " pod="openshift-console/downloads-6bcc868b7-s24ht"
Apr 22 15:36:32.444857 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.444837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-s24ht"
Apr 22 15:36:32.453773 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.453749 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87d75"
Apr 22 15:36:32.459563 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.459543 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"
Apr 22 15:36:32.484980 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.483638 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-542rl"]
Apr 22 15:36:32.488978 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.488920 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1bbe654_6603_4266_aaa8_3f2f9852be83.slice/crio-94e3978f11364cc00f1704c48db41d187d4582c2f7a8ec03014f0d27d5943ed4 WatchSource:0}: Error finding container 94e3978f11364cc00f1704c48db41d187d4582c2f7a8ec03014f0d27d5943ed4: Status 404 returned error can't find the container with id 94e3978f11364cc00f1704c48db41d187d4582c2f7a8ec03014f0d27d5943ed4
Apr 22 15:36:32.505135 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.504688 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:32.513032 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.512782 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8njnj" event={"ID":"a1de4878-d335-41a9-a163-5332f8e575d6","Type":"ContainerStarted","Data":"3257757cad73e6f498fb12ba419395c6f516889e04665aace9260f7f6b9f5d72"}
Apr 22 15:36:32.514546 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.514487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" event={"ID":"f1bbe654-6603-4266-aaa8-3f2f9852be83","Type":"ContainerStarted","Data":"94e3978f11364cc00f1704c48db41d187d4582c2f7a8ec03014f0d27d5943ed4"}
Apr 22 15:36:32.516290 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.515930 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-swhzp" event={"ID":"74e2d982-4c5d-4d80-8ce4-b86491c0f765","Type":"ContainerStarted","Data":"886d880b8609799ae172c532d3917e02002da319b142723007fc5ac286976c70"}
Apr 22 15:36:32.592303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.592243 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-s24ht"]
Apr 22 15:36:32.597501 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.597466 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15a5003_51ba_4bc8_a06b_2215afd58ed5.slice/crio-43406e66a6ad45ac5349d8fa24ea4d112fa5dc97358a51b8e2af2c5fb5ea3fe6 WatchSource:0}: Error finding container 43406e66a6ad45ac5349d8fa24ea4d112fa5dc97358a51b8e2af2c5fb5ea3fe6: Status 404 returned error can't find the container with id 43406e66a6ad45ac5349d8fa24ea4d112fa5dc97358a51b8e2af2c5fb5ea3fe6
Apr 22 15:36:32.674716 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.674663 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f98788744-gfqth"]
Apr 22 15:36:32.678190 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.678147 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ebe0392_9655_4115_b039_44138ef50068.slice/crio-063601d410b8089a36ac3d86d286d2001428513022f3826119d46b448b5fffc0 WatchSource:0}: Error finding container 063601d410b8089a36ac3d86d286d2001428513022f3826119d46b448b5fffc0: Status 404 returned error can't find the container with id 063601d410b8089a36ac3d86d286d2001428513022f3826119d46b448b5fffc0
Apr 22 15:36:32.836539 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.836424 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"]
Apr 22 15:36:32.837648 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:32.837508 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87d75"]
Apr 22 15:36:32.842588 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.842396 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc909177c_1f2c_45a6_b86e_d1ebca982be2.slice/crio-c79fefbd1dede9ee9f60b3c4a588312a122b5ab942dbf71769348ce36aba0f64 WatchSource:0}: Error finding container c79fefbd1dede9ee9f60b3c4a588312a122b5ab942dbf71769348ce36aba0f64: Status 404 returned error can't find the container with id c79fefbd1dede9ee9f60b3c4a588312a122b5ab942dbf71769348ce36aba0f64
Apr 22 15:36:32.843700 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:32.843585 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7193304_0a5d_46fd_89e7_9c0b2192fa40.slice/crio-1fbe488a8118d8046ae8c2668378343d5f567e5bf6c8e7e101d58369320338dd WatchSource:0}: Error finding container 1fbe488a8118d8046ae8c2668378343d5f567e5bf6c8e7e101d58369320338dd: Status 404 returned error can't find the container with id 1fbe488a8118d8046ae8c2668378343d5f567e5bf6c8e7e101d58369320338dd
Apr 22 15:36:33.520447 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.520405 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s24ht" event={"ID":"d15a5003-51ba-4bc8-a06b-2215afd58ed5","Type":"ContainerStarted","Data":"43406e66a6ad45ac5349d8fa24ea4d112fa5dc97358a51b8e2af2c5fb5ea3fe6"}
Apr 22 15:36:33.522093 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.522036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f98788744-gfqth" event={"ID":"3ebe0392-9655-4115-b039-44138ef50068","Type":"ContainerStarted","Data":"2ca92bc1c05aa9eaf114df34f4b37806111e7e64bd0d596067af7611d1c54ae2"}
Apr 22 15:36:33.522093 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.522071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f98788744-gfqth" event={"ID":"3ebe0392-9655-4115-b039-44138ef50068","Type":"ContainerStarted","Data":"063601d410b8089a36ac3d86d286d2001428513022f3826119d46b448b5fffc0"}
Apr 22 15:36:33.522301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.522161 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-f98788744-gfqth"
Apr 22 15:36:33.523391 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.523336 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5" event={"ID":"c909177c-1f2c-45a6-b86e-d1ebca982be2","Type":"ContainerStarted","Data":"c79fefbd1dede9ee9f60b3c4a588312a122b5ab942dbf71769348ce36aba0f64"}
Apr 22 15:36:33.524810 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.524789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87d75" event={"ID":"c7193304-0a5d-46fd-89e7-9c0b2192fa40","Type":"ContainerStarted","Data":"b1cc762d805b2018e4d943130ef0340f36a63246f7c177abebc8597c5cf6e8d9"}
Apr 22 15:36:33.524937 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.524815 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87d75" event={"ID":"c7193304-0a5d-46fd-89e7-9c0b2192fa40","Type":"ContainerStarted","Data":"1fbe488a8118d8046ae8c2668378343d5f567e5bf6c8e7e101d58369320338dd"}
Apr 22 15:36:33.545223 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:33.545174 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f98788744-gfqth" podStartSLOduration=1.545158272 podStartE2EDuration="1.545158272s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:36:33.542993624 +0000 UTC m=+163.012544422" watchObservedRunningTime="2026-04-22 15:36:33.545158272 +0000 UTC m=+163.014709072"
Apr 22 15:36:35.533216 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.532686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" event={"ID":"f1bbe654-6603-4266-aaa8-3f2f9852be83","Type":"ContainerStarted","Data":"48b9149e44e534fce925327e1ea12b75c46626403a885f116267d449c7afb1d8"}
Apr 22 15:36:35.535177 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.535140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-swhzp" event={"ID":"74e2d982-4c5d-4d80-8ce4-b86491c0f765","Type":"ContainerStarted","Data":"8701b57f123c49071adf00fb7b64b5f5fe17427ae62347dab80dac4535e351ec"}
Apr 22 15:36:35.535318 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.535179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-swhzp" event={"ID":"74e2d982-4c5d-4d80-8ce4-b86491c0f765","Type":"ContainerStarted","Data":"405006ff771c847110192dbb5bc6ed5ade79a0856323e4a0a11d5becf86bf36f"}
Apr 22 15:36:35.535318 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.535244 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-swhzp"
Apr 22 15:36:35.536809 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.536769 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8njnj" event={"ID":"a1de4878-d335-41a9-a163-5332f8e575d6","Type":"ContainerStarted","Data":"cfcb3deeffcfdb67b6a7674347f6f5c66cf8d1415b4406cfa87011a03d85de26"}
Apr 22 15:36:35.538539 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.538506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5" event={"ID":"c909177c-1f2c-45a6-b86e-d1ebca982be2","Type":"ContainerStarted","Data":"49d1e8b2bffd294fc821ab07fe3124f526fe204d3e0729139e526c4c6dd3c51b"}
Apr 22 15:36:35.538830 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.538807 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"
Apr 22 15:36:35.540619 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.540552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87d75" event={"ID":"c7193304-0a5d-46fd-89e7-9c0b2192fa40","Type":"ContainerStarted","Data":"5bce45d702ffd7e043767a1bb5cf504a9e0bc1e2537889e25905ff7852879eab"}
Apr 22 15:36:35.544997 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.544961 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5"
Apr 22 15:36:35.547461 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.547416 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-542rl" podStartSLOduration=1.262365033 podStartE2EDuration="3.547401756s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.491115818 +0000 UTC m=+161.960666596" lastFinishedPulling="2026-04-22 15:36:34.776152538 +0000 UTC m=+164.245703319" observedRunningTime="2026-04-22 15:36:35.547011358 +0000 UTC m=+165.016562160" watchObservedRunningTime="2026-04-22 15:36:35.547401756 +0000 UTC m=+165.016952557"
Apr 22 15:36:35.564051 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.564001 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8njnj" podStartSLOduration=129.862013066 podStartE2EDuration="2m12.563985663s" podCreationTimestamp="2026-04-22 15:34:23 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.1842195 +0000 UTC m=+161.653770292" lastFinishedPulling="2026-04-22 15:36:34.886191927 +0000 UTC m=+164.355742889" observedRunningTime="2026-04-22 15:36:35.562412514 +0000 UTC m=+165.031963316" watchObservedRunningTime="2026-04-22 15:36:35.563985663 +0000 UTC m=+165.033536463"
Apr 22 15:36:35.578500 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.578452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-swhzp" podStartSLOduration=129.995453538 podStartE2EDuration="2m12.578435481s" podCreationTimestamp="2026-04-22 15:34:23 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.193164255 +0000 UTC m=+161.662715037" lastFinishedPulling="2026-04-22 15:36:34.776146201 +0000 UTC m=+164.245696980" observedRunningTime="2026-04-22 15:36:35.577708757 +0000 UTC m=+165.047259557" watchObservedRunningTime="2026-04-22 15:36:35.578435481 +0000 UTC m=+165.047986282"
Apr 22 15:36:35.593696 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:35.593304 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qltc5" podStartSLOduration=1.547180493 podStartE2EDuration="3.593291982s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.844374628 +0000 UTC m=+162.313925404" lastFinishedPulling="2026-04-22 15:36:34.890486116 +0000 UTC m=+164.360036893" observedRunningTime="2026-04-22 15:36:35.591888881 +0000 UTC m=+165.061439682" watchObservedRunningTime="2026-04-22 15:36:35.593291982 +0000 UTC m=+165.062842784"
Apr 22 15:36:36.547642 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:36.547603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87d75" event={"ID":"c7193304-0a5d-46fd-89e7-9c0b2192fa40","Type":"ContainerStarted","Data":"01e92252bd614ebb1ad671b735b39b0cd90ec2860784d118ea130579bf03336d"}
Apr 22 15:36:36.565439 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:36.565395 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-87d75" podStartSLOduration=1.156718033 podStartE2EDuration="4.565382146s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.913162593 +0000 UTC m=+162.382713378" lastFinishedPulling="2026-04-22 15:36:36.321826709 +0000 UTC m=+165.791377491" observedRunningTime="2026-04-22 15:36:36.5645949 +0000 UTC m=+166.034145700" watchObservedRunningTime="2026-04-22 15:36:36.565382146 +0000 UTC m=+166.034932944"
Apr 22 15:36:38.081100 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:38.081061 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv"
Apr 22 15:36:41.489658 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.489562 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"]
Apr 22 15:36:41.492102 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.492083 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.495763 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.495737 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 15:36:41.495876 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.495789 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 15:36:41.496088 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.496068 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8x7kq\""
Apr 22 15:36:41.496747 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.496725 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 15:36:41.496827 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.496765 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 15:36:41.497651 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.497634 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 15:36:41.508254 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.508231 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"]
Apr 22 15:36:41.534508 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.534480 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2g6b8"]
Apr 22 15:36:41.537029 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.537011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.541150 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.541130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 15:36:41.541242 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.541173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 15:36:41.541354 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.541337 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pqtf6\""
Apr 22 15:36:41.541442 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.541364 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 15:36:41.582465 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-root\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582583 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb48h\" (UniqueName: \"kubernetes.io/projected/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-kube-api-access-hb48h\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.582583 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-sys\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582583 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.582729 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-metrics-client-ca\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582729 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.582798 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582750 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs7gt\" (UniqueName: \"kubernetes.io/projected/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-kube-api-access-cs7gt\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582798 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-tls\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582872 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-textfile\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582872 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582966 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582879 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-accelerators-collector-config\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.582966 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.583041 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.582991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-wtmp\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.683525 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-root\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.683659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683546 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb48h\" (UniqueName: \"kubernetes.io/projected/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-kube-api-access-hb48h\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.683659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-root\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.683659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-sys\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.683659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"
Apr 22 15:36:41.683659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-sys\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8"
Apr 22 15:36:41.683659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-metrics-client-ca\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") "
pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs7gt\" (UniqueName: \"kubernetes.io/projected/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-kube-api-access-cs7gt\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-tls\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-textfile\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-accelerators-collector-config\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683832 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.683947 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.683884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-wtmp\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.684313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.684055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-wtmp\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.684313 ip-10-0-143-30 
kubenswrapper[2577]: E0422 15:36:41.684068 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 15:36:41.684313 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:41.684142 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-tls podName:0e7ee1be-fd13-4aac-b8e4-c8d8f5664851 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:42.184119757 +0000 UTC m=+171.653670555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-tls") pod "node-exporter-2g6b8" (UID: "0e7ee1be-fd13-4aac-b8e4-c8d8f5664851") : secret "node-exporter-tls" not found Apr 22 15:36:41.684313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.684159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-textfile\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.684313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.684278 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-metrics-client-ca\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.684567 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.684441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") 
" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.684624 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.684556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-accelerators-collector-config\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.687008 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.686987 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.687140 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.687016 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.687263 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.687243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.695652 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.695626 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb48h\" (UniqueName: \"kubernetes.io/projected/0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4-kube-api-access-hb48h\") pod \"openshift-state-metrics-9d44df66c-kpckp\" (UID: \"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.695773 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.695752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs7gt\" (UniqueName: \"kubernetes.io/projected/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-kube-api-access-cs7gt\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:41.803102 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.803036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" Apr 22 15:36:41.933416 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:41.933385 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp"] Apr 22 15:36:41.936037 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:41.936005 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ed083c6_2f3b_4a6b_8e4e_1663652dd2b4.slice/crio-dca395d0458b870152151dd3e59f772e7bea17b99fbd2bf40170340b7a17df02 WatchSource:0}: Error finding container dca395d0458b870152151dd3e59f772e7bea17b99fbd2bf40170340b7a17df02: Status 404 returned error can't find the container with id dca395d0458b870152151dd3e59f772e7bea17b99fbd2bf40170340b7a17df02 Apr 22 15:36:42.131808 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.131762 2577 patch_prober.go:28] interesting pod/image-registry-77848d7596-wp72d container/registry namespace/openshift-image-registry: Readiness probe status=failure 
output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:36:42.131982 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.131820 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-77848d7596-wp72d" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:36:42.189724 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.189699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-tls\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:42.192328 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.192305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0e7ee1be-fd13-4aac-b8e4-c8d8f5664851-node-exporter-tls\") pod \"node-exporter-2g6b8\" (UID: \"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851\") " pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:42.447549 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.447459 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2g6b8" Apr 22 15:36:42.455747 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:42.455722 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e7ee1be_fd13_4aac_b8e4_c8d8f5664851.slice/crio-abbcc8df0879f1e4ac03dff77bb1b1107ebe0f1f9d0e27ba5a06d2e6cb85d063 WatchSource:0}: Error finding container abbcc8df0879f1e4ac03dff77bb1b1107ebe0f1f9d0e27ba5a06d2e6cb85d063: Status 404 returned error can't find the container with id abbcc8df0879f1e4ac03dff77bb1b1107ebe0f1f9d0e27ba5a06d2e6cb85d063 Apr 22 15:36:42.565104 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.565051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2g6b8" event={"ID":"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851","Type":"ContainerStarted","Data":"abbcc8df0879f1e4ac03dff77bb1b1107ebe0f1f9d0e27ba5a06d2e6cb85d063"} Apr 22 15:36:42.566812 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.566786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" event={"ID":"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4","Type":"ContainerStarted","Data":"620ac119c8e3dc947fdde7e6f77121e2de2ce6a9d40a279e213fad6bad0c155e"} Apr 22 15:36:42.566953 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.566817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" event={"ID":"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4","Type":"ContainerStarted","Data":"704128f08401782e6a813eca29a01636005c368ee2bff82436614b6477e615a1"} Apr 22 15:36:42.566953 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.566833 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" 
event={"ID":"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4","Type":"ContainerStarted","Data":"dca395d0458b870152151dd3e59f772e7bea17b99fbd2bf40170340b7a17df02"} Apr 22 15:36:42.608014 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.607439 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:36:42.610932 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.610883 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.614207 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 15:36:42.614301 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-jg8qk\"" Apr 22 15:36:42.614362 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614320 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 15:36:42.614828 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614577 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 15:36:42.614828 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 15:36:42.614998 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 15:36:42.614998 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.614891 2577 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 15:36:42.615107 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.615048 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 15:36:42.615158 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.615115 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 15:36:42.615208 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.615168 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 15:36:42.627998 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.627975 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:36:42.693166 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693309 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693309 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693309 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693240 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-web-config\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693309 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693276 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693504 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693504 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
15:36:42.693504 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693647 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mth5k\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-kube-api-access-mth5k\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693647 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-config-volume\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693647 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693647 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693595 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.693647 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.693634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-config-out\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794107 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mth5k\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-kube-api-access-mth5k\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-config-volume\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794177 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-config-out\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794336 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-web-config\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794378 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794402 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.794478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.794827 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:42.794603 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle podName:b9379052-cac4-4c31-819f-c98344e62728 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:43.294588724 +0000 UTC m=+172.764139500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "b9379052-cac4-4c31-819f-c98344e62728") : configmap references non-existent config key: ca-bundle.crt Apr 22 15:36:42.794827 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:42.794804 2577 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 15:36:42.794966 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:42.794856 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls podName:b9379052-cac4-4c31-819f-c98344e62728 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:43.294838232 +0000 UTC m=+172.764389026 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "b9379052-cac4-4c31-819f-c98344e62728") : secret "alertmanager-main-tls" not found Apr 22 15:36:42.796384 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.796131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.797549 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.797495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.797952 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.797881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-config-volume\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.798065 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.798006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.798208 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:36:42.798187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.798622 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.798599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-web-config\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.799438 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.799123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.799438 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.799367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.799572 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.799439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.800006 
ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.799990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-config-out\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:42.802447 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:42.802421 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mth5k\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-kube-api-access-mth5k\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:43.298235 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.298201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:43.298396 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.298258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:43.299109 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.299080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:43.300571 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.300554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:43.521490 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.521452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:36:43.572873 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.572776 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" event={"ID":"0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4","Type":"ContainerStarted","Data":"e4b15c19285c5903ce819151099b8361605a5e74cba406e058955cc913024703"} Apr 22 15:36:43.595218 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.594870 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-kpckp" podStartSLOduration=1.695810461 podStartE2EDuration="2.594849847s" podCreationTimestamp="2026-04-22 15:36:41 +0000 UTC" firstStartedPulling="2026-04-22 15:36:42.05749214 +0000 UTC m=+171.527042918" lastFinishedPulling="2026-04-22 15:36:42.956531523 +0000 UTC m=+172.426082304" observedRunningTime="2026-04-22 15:36:43.59304879 +0000 UTC m=+173.062599613" watchObservedRunningTime="2026-04-22 15:36:43.594849847 +0000 UTC m=+173.064400650" Apr 22 15:36:43.672063 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:43.672036 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:36:43.675113 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:36:43.675084 2577 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9379052_cac4_4c31_819f_c98344e62728.slice/crio-1973a9a1968c86bc62fcf4b28752959d78ffda1eec134bb33459b8fd3f17cd41 WatchSource:0}: Error finding container 1973a9a1968c86bc62fcf4b28752959d78ffda1eec134bb33459b8fd3f17cd41: Status 404 returned error can't find the container with id 1973a9a1968c86bc62fcf4b28752959d78ffda1eec134bb33459b8fd3f17cd41 Apr 22 15:36:44.577576 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:44.577540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"1973a9a1968c86bc62fcf4b28752959d78ffda1eec134bb33459b8fd3f17cd41"} Apr 22 15:36:44.579122 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:44.579091 2577 generic.go:358] "Generic (PLEG): container finished" podID="0e7ee1be-fd13-4aac-b8e4-c8d8f5664851" containerID="acc437008e4f9bbd2b622288ab471e35663a42230112fe1b96942bd36e5f7b1a" exitCode=0 Apr 22 15:36:44.579561 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:44.579294 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2g6b8" event={"ID":"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851","Type":"ContainerDied","Data":"acc437008e4f9bbd2b622288ab471e35663a42230112fe1b96942bd36e5f7b1a"} Apr 22 15:36:45.549790 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:45.549761 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-swhzp" Apr 22 15:36:45.584473 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:45.584440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2g6b8" event={"ID":"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851","Type":"ContainerStarted","Data":"3c468c49625ed2f089b43b4bd84a789020ed7ded3d6b576e126246fea9854ea9"} Apr 22 15:36:45.584926 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:45.584481 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2g6b8" event={"ID":"0e7ee1be-fd13-4aac-b8e4-c8d8f5664851","Type":"ContainerStarted","Data":"d059c2763d8dec3359e00f5dcf3cc32ae963e80fc431761ab1ca9ad84532003a"} Apr 22 15:36:45.586598 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:45.586570 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db" exitCode=0 Apr 22 15:36:45.586713 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:45.586636 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db"} Apr 22 15:36:45.604874 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:45.604819 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2g6b8" podStartSLOduration=3.131262611 podStartE2EDuration="4.604803502s" podCreationTimestamp="2026-04-22 15:36:41 +0000 UTC" firstStartedPulling="2026-04-22 15:36:42.457739872 +0000 UTC m=+171.927290651" lastFinishedPulling="2026-04-22 15:36:43.931280754 +0000 UTC m=+173.400831542" observedRunningTime="2026-04-22 15:36:45.604074123 +0000 UTC m=+175.073624923" watchObservedRunningTime="2026-04-22 15:36:45.604803502 +0000 UTC m=+175.074354300" Apr 22 15:36:52.129710 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.129674 2577 patch_prober.go:28] interesting pod/image-registry-77848d7596-wp72d container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:36:52.130191 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:36:52.129729 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-77848d7596-wp72d" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:36:52.509206 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.509169 2577 patch_prober.go:28] interesting pod/image-registry-f98788744-gfqth container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:36:52.509383 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.509222 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-f98788744-gfqth" podUID="3ebe0392-9655-4115-b039-44138ef50068" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:36:52.614058 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.613845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-s24ht" event={"ID":"d15a5003-51ba-4bc8-a06b-2215afd58ed5","Type":"ContainerStarted","Data":"76291abf0bd71815db4fecf0990542bcf3c4c72cee6c7a9ae3b3d6ab2cd7af87"} Apr 22 15:36:52.614371 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.614352 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-s24ht" Apr 22 15:36:52.631099 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.631074 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-s24ht" Apr 22 15:36:52.651124 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:52.651024 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-s24ht" 
podStartSLOduration=0.900890615 podStartE2EDuration="20.651005982s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.601090716 +0000 UTC m=+162.070641506" lastFinishedPulling="2026-04-22 15:36:52.351206096 +0000 UTC m=+181.820756873" observedRunningTime="2026-04-22 15:36:52.650566564 +0000 UTC m=+182.120117367" watchObservedRunningTime="2026-04-22 15:36:52.651005982 +0000 UTC m=+182.120556782" Apr 22 15:36:54.533575 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:54.533544 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f98788744-gfqth" Apr 22 15:36:54.639842 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:54.639789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689"} Apr 22 15:36:54.640038 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:54.639851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec"} Apr 22 15:36:54.640038 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:54.639870 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e"} Apr 22 15:36:54.640038 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:54.639887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f"} 
Apr 22 15:36:54.640038 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:54.639921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9"} Apr 22 15:36:55.646797 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:55.646711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerStarted","Data":"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c"} Apr 22 15:36:55.676582 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:55.676526 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.96557703 podStartE2EDuration="13.676511271s" podCreationTimestamp="2026-04-22 15:36:42 +0000 UTC" firstStartedPulling="2026-04-22 15:36:43.677763521 +0000 UTC m=+173.147314303" lastFinishedPulling="2026-04-22 15:36:55.388697767 +0000 UTC m=+184.858248544" observedRunningTime="2026-04-22 15:36:55.674106573 +0000 UTC m=+185.143657385" watchObservedRunningTime="2026-04-22 15:36:55.676511271 +0000 UTC m=+185.146062069" Apr 22 15:36:57.146340 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.146278 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-77848d7596-wp72d" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerName="registry" containerID="cri-o://4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404" gracePeriod=30 Apr 22 15:36:57.413357 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.413334 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-77848d7596-wp72d" Apr 22 15:36:57.526114 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526080 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-bound-sa-token\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.526281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526166 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-certificates\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.526281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526196 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.526281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526240 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54f56349-cf1e-4efa-b6f2-ff228d7570a3-ca-trust-extracted\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.526281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526266 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b4ls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-kube-api-access-9b4ls\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 
15:36:57.526585 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526551 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:36:57.526697 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526636 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-trusted-ca\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.526697 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526688 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-image-registry-private-configuration\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.527006 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526708 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-installation-pull-secrets\") pod \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\" (UID: \"54f56349-cf1e-4efa-b6f2-ff228d7570a3\") " Apr 22 15:36:57.527006 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526870 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:36:57.527006 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526975 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-certificates\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.527006 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.526996 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54f56349-cf1e-4efa-b6f2-ff228d7570a3-trusted-ca\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.528742 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.528695 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:36:57.528891 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.528864 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:36:57.529014 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.528986 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-kube-api-access-9b4ls" (OuterVolumeSpecName: "kube-api-access-9b4ls") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "kube-api-access-9b4ls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:36:57.529349 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.529317 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:36:57.529484 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.529462 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:36:57.538592 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.538564 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f56349-cf1e-4efa-b6f2-ff228d7570a3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "54f56349-cf1e-4efa-b6f2-ff228d7570a3" (UID: "54f56349-cf1e-4efa-b6f2-ff228d7570a3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:36:57.628032 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.627991 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-bound-sa-token\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.628032 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.628031 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-registry-tls\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.628232 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.628046 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54f56349-cf1e-4efa-b6f2-ff228d7570a3-ca-trust-extracted\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.628232 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.628062 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9b4ls\" (UniqueName: \"kubernetes.io/projected/54f56349-cf1e-4efa-b6f2-ff228d7570a3-kube-api-access-9b4ls\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.628232 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.628074 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-image-registry-private-configuration\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:36:57.628232 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.628083 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54f56349-cf1e-4efa-b6f2-ff228d7570a3-installation-pull-secrets\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 
22 15:36:57.654663 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.654587 2577 generic.go:358] "Generic (PLEG): container finished" podID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerID="4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404" exitCode=0 Apr 22 15:36:57.654663 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.654657 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-77848d7596-wp72d" Apr 22 15:36:57.654837 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.654663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77848d7596-wp72d" event={"ID":"54f56349-cf1e-4efa-b6f2-ff228d7570a3","Type":"ContainerDied","Data":"4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404"} Apr 22 15:36:57.654837 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.654695 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-77848d7596-wp72d" event={"ID":"54f56349-cf1e-4efa-b6f2-ff228d7570a3","Type":"ContainerDied","Data":"35205faf185bd5ba5ac45dc391bbc18a7179d479e2d53ea3003a628190e20218"} Apr 22 15:36:57.654837 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.654715 2577 scope.go:117] "RemoveContainer" containerID="4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404" Apr 22 15:36:57.664227 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.664205 2577 scope.go:117] "RemoveContainer" containerID="4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404" Apr 22 15:36:57.664529 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:36:57.664505 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404\": container with ID starting with 4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404 not found: ID does not exist" 
containerID="4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404" Apr 22 15:36:57.664618 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.664541 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404"} err="failed to get container status \"4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404\": rpc error: code = NotFound desc = could not find container \"4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404\": container with ID starting with 4cc41f5ec8fdf36c1454080a21e1f6ae66eff86c9720739cc82a0e5ed3ef9404 not found: ID does not exist" Apr 22 15:36:57.683401 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.682942 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-77848d7596-wp72d"] Apr 22 15:36:57.684591 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:57.684568 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-77848d7596-wp72d"] Apr 22 15:36:59.085801 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:36:59.085765 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" path="/var/lib/kubelet/pods/54f56349-cf1e-4efa-b6f2-ff228d7570a3/volumes" Apr 22 15:37:18.717496 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:37:18.717464 2577 generic.go:358] "Generic (PLEG): container finished" podID="31ebe0d7-a716-4a01-85dd-f78472a7c41b" containerID="428b3ae6a295f7f06a8e0f9639cf4dea5186bd6597bce28f1befee889be6ac74" exitCode=0 Apr 22 15:37:18.717865 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:37:18.717504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj" event={"ID":"31ebe0d7-a716-4a01-85dd-f78472a7c41b","Type":"ContainerDied","Data":"428b3ae6a295f7f06a8e0f9639cf4dea5186bd6597bce28f1befee889be6ac74"} Apr 22 
15:37:18.717865 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:37:18.717750 2577 scope.go:117] "RemoveContainer" containerID="428b3ae6a295f7f06a8e0f9639cf4dea5186bd6597bce28f1befee889be6ac74" Apr 22 15:37:19.721848 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:37:19.721813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lhbsj" event={"ID":"31ebe0d7-a716-4a01-85dd-f78472a7c41b","Type":"ContainerStarted","Data":"0a95b469cce9e90cd04b92730bdcc908c1d1ebc8ba2100165f5eb76edb6ec0cf"} Apr 22 15:38:01.785076 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785041 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:38:01.785531 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785429 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="alertmanager" containerID="cri-o://3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9" gracePeriod=120 Apr 22 15:38:01.785601 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785529 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="config-reloader" containerID="cri-o://cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f" gracePeriod=120 Apr 22 15:38:01.785601 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785528 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-web" containerID="cri-o://21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e" gracePeriod=120 Apr 22 15:38:01.785601 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785502 2577 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-metric" containerID="cri-o://bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689" gracePeriod=120 Apr 22 15:38:01.785601 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785553 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy" containerID="cri-o://6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec" gracePeriod=120 Apr 22 15:38:01.785775 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:01.785554 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="prom-label-proxy" containerID="cri-o://3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c" gracePeriod=120 Apr 22 15:38:02.803078 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.803047 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:38:02.805341 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.805323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7958036-9067-47bb-91dc-dc565feb289a-metrics-certs\") pod \"network-metrics-daemon-zzcmv\" (UID: \"f7958036-9067-47bb-91dc-dc565feb289a\") " pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:38:02.842767 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842738 2577 generic.go:358] "Generic 
(PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c" exitCode=0 Apr 22 15:38:02.842767 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842760 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec" exitCode=0 Apr 22 15:38:02.842767 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842768 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f" exitCode=0 Apr 22 15:38:02.842989 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842774 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9" exitCode=0 Apr 22 15:38:02.842989 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c"} Apr 22 15:38:02.842989 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842851 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec"} Apr 22 15:38:02.842989 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f"} Apr 22 
15:38:02.842989 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.842878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9"} Apr 22 15:38:02.984825 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.984798 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxbwk\"" Apr 22 15:38:02.993263 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:02.993246 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzcmv" Apr 22 15:38:03.017938 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.017892 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:03.104832 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104778 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-metrics-client-ca\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.104832 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104814 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.104832 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104833 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-config-volume\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104866 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-tls-assets\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104883 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104929 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-config-out\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104967 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-main-db\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.104992 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105053 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-web-config\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105080 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-web\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105138 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105113 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-cluster-tls-config\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105645 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105152 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mth5k\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-kube-api-access-mth5k\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105645 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105188 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls\") pod \"b9379052-cac4-4c31-819f-c98344e62728\" (UID: \"b9379052-cac4-4c31-819f-c98344e62728\") " Apr 22 15:38:03.105645 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105300 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:03.105645 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105393 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-metrics-client-ca\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.106035 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105662 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:38:03.106035 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.105878 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:03.108483 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.108460 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-kube-api-access-mth5k" (OuterVolumeSpecName: "kube-api-access-mth5k") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "kube-api-access-mth5k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:38:03.108597 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.108496 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.108659 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.108599 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-config-out" (OuterVolumeSpecName: "config-out") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:38:03.108724 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.108679 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.109251 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.109221 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.109624 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.109591 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:38:03.109719 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.109683 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.109835 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.109817 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.113470 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.113442 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.117984 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.117963 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zzcmv"] Apr 22 15:38:03.119075 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.119050 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-web-config" (OuterVolumeSpecName: "web-config") pod "b9379052-cac4-4c31-819f-c98344e62728" (UID: "b9379052-cac4-4c31-819f-c98344e62728"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:03.121720 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:38:03.121699 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7958036_9067_47bb_91dc_dc565feb289a.slice/crio-7129628906b118a32c60913de1c97dfed0a100ab48467ca971fd2db9c302fd58 WatchSource:0}: Error finding container 7129628906b118a32c60913de1c97dfed0a100ab48467ca971fd2db9c302fd58: Status 404 returned error can't find the container with id 7129628906b118a32c60913de1c97dfed0a100ab48467ca971fd2db9c302fd58 Apr 22 15:38:03.205935 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205912 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-config-out\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.205935 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205933 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-main-db\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205944 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205961 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-web-config\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205974 2577 reconciler_common.go:299] 
"Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205983 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-cluster-tls-config\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.205991 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mth5k\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-kube-api-access-mth5k\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.206000 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-main-tls\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.206008 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9379052-cac4-4c31-819f-c98344e62728-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.206016 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-config-volume\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.206024 2577 reconciler_common.go:299] "Volume detached for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9379052-cac4-4c31-819f-c98344e62728-tls-assets\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.206053 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.206033 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b9379052-cac4-4c31-819f-c98344e62728-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:38:03.847242 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.847207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zzcmv" event={"ID":"f7958036-9067-47bb-91dc-dc565feb289a","Type":"ContainerStarted","Data":"7129628906b118a32c60913de1c97dfed0a100ab48467ca971fd2db9c302fd58"} Apr 22 15:38:03.850505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850470 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689" exitCode=0 Apr 22 15:38:03.850505 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850497 2577 generic.go:358] "Generic (PLEG): container finished" podID="b9379052-cac4-4c31-819f-c98344e62728" containerID="21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e" exitCode=0 Apr 22 15:38:03.850676 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850548 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689"} Apr 22 15:38:03.850676 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e"} Apr 22 15:38:03.850676 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b9379052-cac4-4c31-819f-c98344e62728","Type":"ContainerDied","Data":"1973a9a1968c86bc62fcf4b28752959d78ffda1eec134bb33459b8fd3f17cd41"} Apr 22 15:38:03.850676 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850623 2577 scope.go:117] "RemoveContainer" containerID="3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c" Apr 22 15:38:03.850676 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.850649 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:03.873964 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.873941 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:38:03.879750 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.879728 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:38:03.905034 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905013 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:38:03.905257 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905245 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="init-config-reloader" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905258 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="init-config-reloader" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905277 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="alertmanager" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905282 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="alertmanager" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905288 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-metric" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905294 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-metric" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905299 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="config-reloader" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905304 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="config-reloader" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905309 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="prom-label-proxy" Apr 22 15:38:03.905313 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905314 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="prom-label-proxy" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905320 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-web" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:38:03.905325 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-web" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905333 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905338 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905345 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerName="registry" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905350 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerName="registry" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905398 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-web" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905408 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="alertmanager" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905415 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy-metric" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905420 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="prom-label-proxy" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 
15:38:03.905427 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="config-reloader" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905433 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9379052-cac4-4c31-819f-c98344e62728" containerName="kube-rbac-proxy" Apr 22 15:38:03.905643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.905439 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="54f56349-cf1e-4efa-b6f2-ff228d7570a3" containerName="registry" Apr 22 15:38:03.909143 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.909126 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:03.911212 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.911181 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 15:38:03.911317 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.911256 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 15:38:03.911808 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.911786 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 15:38:03.911917 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.911807 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 15:38:03.912355 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.912207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 15:38:03.912355 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.912280 2577 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 15:38:03.912355 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.912280 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-jg8qk\"" Apr 22 15:38:03.912553 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.912283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 15:38:03.912767 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.912745 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 15:38:03.916950 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.916740 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 15:38:03.922100 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.922081 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:38:03.989771 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.989612 2577 scope.go:117] "RemoveContainer" containerID="bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689" Apr 22 15:38:03.996664 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:03.996646 2577 scope.go:117] "RemoveContainer" containerID="6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec" Apr 22 15:38:04.002942 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.002891 2577 scope.go:117] "RemoveContainer" containerID="21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e" Apr 22 15:38:04.008821 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.008802 2577 scope.go:117] "RemoveContainer" containerID="cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f" Apr 22 
15:38:04.013143 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8040e9a9-11fe-47b1-9008-0b52af51bfc0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013230 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-config-volume\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013230 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013230 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgxl6\" (UniqueName: 
\"kubernetes.io/projected/8040e9a9-11fe-47b1-9008-0b52af51bfc0-kube-api-access-sgxl6\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013472 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8040e9a9-11fe-47b1-9008-0b52af51bfc0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013472 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013437 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-web-config\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013472 ip-10-0-143-30 
kubenswrapper[2577]: I0422 15:38:04.013464 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8040e9a9-11fe-47b1-9008-0b52af51bfc0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013575 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013480 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8040e9a9-11fe-47b1-9008-0b52af51bfc0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013575 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.013575 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.013529 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8040e9a9-11fe-47b1-9008-0b52af51bfc0-config-out\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.033269 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.033245 2577 scope.go:117] "RemoveContainer" containerID="3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9" Apr 22 15:38:04.041225 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.041153 2577 
scope.go:117] "RemoveContainer" containerID="08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db" Apr 22 15:38:04.049162 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049146 2577 scope.go:117] "RemoveContainer" containerID="3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c" Apr 22 15:38:04.049409 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.049390 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c\": container with ID starting with 3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c not found: ID does not exist" containerID="3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c" Apr 22 15:38:04.049495 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049419 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c"} err="failed to get container status \"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c\": rpc error: code = NotFound desc = could not find container \"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c\": container with ID starting with 3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c not found: ID does not exist" Apr 22 15:38:04.049495 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049448 2577 scope.go:117] "RemoveContainer" containerID="bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689" Apr 22 15:38:04.049668 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.049652 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689\": container with ID starting with bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689 not found: ID does not 
exist" containerID="bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689" Apr 22 15:38:04.049708 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049674 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689"} err="failed to get container status \"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689\": rpc error: code = NotFound desc = could not find container \"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689\": container with ID starting with bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689 not found: ID does not exist" Apr 22 15:38:04.049708 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049691 2577 scope.go:117] "RemoveContainer" containerID="6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec" Apr 22 15:38:04.049889 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.049872 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec\": container with ID starting with 6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec not found: ID does not exist" containerID="6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec" Apr 22 15:38:04.049955 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049919 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec"} err="failed to get container status \"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec\": rpc error: code = NotFound desc = could not find container \"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec\": container with ID starting with 6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec not found: ID does not exist" Apr 22 
15:38:04.049955 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.049937 2577 scope.go:117] "RemoveContainer" containerID="21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e" Apr 22 15:38:04.050153 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.050139 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e\": container with ID starting with 21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e not found: ID does not exist" containerID="21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e" Apr 22 15:38:04.050207 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050156 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e"} err="failed to get container status \"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e\": rpc error: code = NotFound desc = could not find container \"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e\": container with ID starting with 21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e not found: ID does not exist" Apr 22 15:38:04.050207 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050168 2577 scope.go:117] "RemoveContainer" containerID="cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f" Apr 22 15:38:04.050403 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.050385 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f\": container with ID starting with cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f not found: ID does not exist" containerID="cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f" Apr 22 15:38:04.050468 
ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050410 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f"} err="failed to get container status \"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f\": rpc error: code = NotFound desc = could not find container \"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f\": container with ID starting with cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f not found: ID does not exist" Apr 22 15:38:04.050468 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050433 2577 scope.go:117] "RemoveContainer" containerID="3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9" Apr 22 15:38:04.050731 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.050684 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9\": container with ID starting with 3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9 not found: ID does not exist" containerID="3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9" Apr 22 15:38:04.050731 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050700 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9"} err="failed to get container status \"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9\": rpc error: code = NotFound desc = could not find container \"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9\": container with ID starting with 3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9 not found: ID does not exist" Apr 22 15:38:04.050731 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050712 2577 scope.go:117] "RemoveContainer" 
containerID="08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db" Apr 22 15:38:04.050971 ip-10-0-143-30 kubenswrapper[2577]: E0422 15:38:04.050951 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db\": container with ID starting with 08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db not found: ID does not exist" containerID="08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db" Apr 22 15:38:04.051015 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050976 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db"} err="failed to get container status \"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db\": rpc error: code = NotFound desc = could not find container \"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db\": container with ID starting with 08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db not found: ID does not exist" Apr 22 15:38:04.051015 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.050990 2577 scope.go:117] "RemoveContainer" containerID="3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c" Apr 22 15:38:04.051231 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051204 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c"} err="failed to get container status \"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c\": rpc error: code = NotFound desc = could not find container \"3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c\": container with ID starting with 3924015e11a05bc1c4f27d464d3667e2438c2178716ee31bf1ef4e18d428bc9c not found: ID does not exist" Apr 22 
15:38:04.051275 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051236 2577 scope.go:117] "RemoveContainer" containerID="bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689" Apr 22 15:38:04.051469 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051452 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689"} err="failed to get container status \"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689\": rpc error: code = NotFound desc = could not find container \"bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689\": container with ID starting with bd5f6210bd7f2431634176424aab8a24c769f82bd0e89cf86262e33194f81689 not found: ID does not exist" Apr 22 15:38:04.051516 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051469 2577 scope.go:117] "RemoveContainer" containerID="6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec" Apr 22 15:38:04.051689 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051673 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec"} err="failed to get container status \"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec\": rpc error: code = NotFound desc = could not find container \"6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec\": container with ID starting with 6c0e15e83bab2fd3fff045dc387e3918f2097adff0ebe61373119873959d04ec not found: ID does not exist" Apr 22 15:38:04.051740 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051689 2577 scope.go:117] "RemoveContainer" containerID="21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e" Apr 22 15:38:04.051885 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051861 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e"} err="failed to get container status \"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e\": rpc error: code = NotFound desc = could not find container \"21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e\": container with ID starting with 21a692a8331dd159800ca8845920ee445c98e17284698baee74c4d0372fd677e not found: ID does not exist" Apr 22 15:38:04.051959 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.051888 2577 scope.go:117] "RemoveContainer" containerID="cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f" Apr 22 15:38:04.052143 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.052122 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f"} err="failed to get container status \"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f\": rpc error: code = NotFound desc = could not find container \"cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f\": container with ID starting with cf0f5b47771a09dd3af664506dd145c7f4b172fd5aef0cb758ea8d69b499314f not found: ID does not exist" Apr 22 15:38:04.052182 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.052147 2577 scope.go:117] "RemoveContainer" containerID="3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9" Apr 22 15:38:04.052369 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.052352 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9"} err="failed to get container status \"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9\": rpc error: code = NotFound desc = could not find container \"3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9\": container with ID starting with 
3f81ad204072435a897bd0a3f61415c23ad519c23be6377f5d29372a085e00e9 not found: ID does not exist" Apr 22 15:38:04.052409 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.052370 2577 scope.go:117] "RemoveContainer" containerID="08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db" Apr 22 15:38:04.052585 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.052568 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db"} err="failed to get container status \"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db\": rpc error: code = NotFound desc = could not find container \"08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db\": container with ID starting with 08651a0cb995002d5668e83ded611da533829ecc77b6f8964e0e8cbde7e019db not found: ID does not exist" Apr 22 15:38:04.114393 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8040e9a9-11fe-47b1-9008-0b52af51bfc0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114489 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-web-config\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114489 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114424 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8040e9a9-11fe-47b1-9008-0b52af51bfc0-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114489 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114439 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8040e9a9-11fe-47b1-9008-0b52af51bfc0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114489 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114671 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114569 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8040e9a9-11fe-47b1-9008-0b52af51bfc0-config-out\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114671 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8040e9a9-11fe-47b1-9008-0b52af51bfc0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114671 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-config-volume\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114671 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgxl6\" (UniqueName: \"kubernetes.io/projected/8040e9a9-11fe-47b1-9008-0b52af51bfc0-kube-api-access-sgxl6\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114778 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.114853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.114794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8040e9a9-11fe-47b1-9008-0b52af51bfc0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.115340 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.115317 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8040e9a9-11fe-47b1-9008-0b52af51bfc0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.116140 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.115658 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8040e9a9-11fe-47b1-9008-0b52af51bfc0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.118052 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.118032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-web-config\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.118498 
ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.118461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8040e9a9-11fe-47b1-9008-0b52af51bfc0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.118581 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.118555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8040e9a9-11fe-47b1-9008-0b52af51bfc0-config-out\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.118643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.118576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.118888 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.118845 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:38:04.118957 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.118920 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:38:04.119693 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.119677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-config-volume\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:38:04.120325 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.120305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:38:04.120401 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.120349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8040e9a9-11fe-47b1-9008-0b52af51bfc0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:38:04.122822 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.122801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgxl6\" (UniqueName: \"kubernetes.io/projected/8040e9a9-11fe-47b1-9008-0b52af51bfc0-kube-api-access-sgxl6\") pod \"alertmanager-main-0\" (UID: \"8040e9a9-11fe-47b1-9008-0b52af51bfc0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:38:04.220166 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.220135 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:38:04.352738 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.352714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:38:04.355165 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:38:04.355139 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8040e9a9_11fe_47b1_9008_0b52af51bfc0.slice/crio-dcb1afe943941de90daaa7621c712fdfd5eff0332d557ed871ba597920b1c01a WatchSource:0}: Error finding container dcb1afe943941de90daaa7621c712fdfd5eff0332d557ed871ba597920b1c01a: Status 404 returned error can't find the container with id dcb1afe943941de90daaa7621c712fdfd5eff0332d557ed871ba597920b1c01a
Apr 22 15:38:04.855922 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.855867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zzcmv" event={"ID":"f7958036-9067-47bb-91dc-dc565feb289a","Type":"ContainerStarted","Data":"f8e37d1973ef64e0edd22ab5a4de7a533409922e0df002c0b1c326af8c4fa359"}
Apr 22 15:38:04.855922 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.855924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zzcmv" event={"ID":"f7958036-9067-47bb-91dc-dc565feb289a","Type":"ContainerStarted","Data":"303704b095aae495deb6ebd8778f82a60168fd6984658b58e152ae9c5236e88e"}
Apr 22 15:38:04.857852 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.857829 2577 generic.go:358] "Generic (PLEG): container finished" podID="8040e9a9-11fe-47b1-9008-0b52af51bfc0" containerID="77129631bac4a1326dfc8f44102209a860f47578e368200402a260e3a2ba1d5a" exitCode=0
Apr 22 15:38:04.857994 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.857858 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerDied","Data":"77129631bac4a1326dfc8f44102209a860f47578e368200402a260e3a2ba1d5a"}
Apr 22 15:38:04.857994 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.857875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"dcb1afe943941de90daaa7621c712fdfd5eff0332d557ed871ba597920b1c01a"}
Apr 22 15:38:04.872261 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:04.872213 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zzcmv" podStartSLOduration=252.951512305 podStartE2EDuration="4m13.87219766s" podCreationTimestamp="2026-04-22 15:33:51 +0000 UTC" firstStartedPulling="2026-04-22 15:38:03.123434316 +0000 UTC m=+252.592985092" lastFinishedPulling="2026-04-22 15:38:04.044119665 +0000 UTC m=+253.513670447" observedRunningTime="2026-04-22 15:38:04.871623915 +0000 UTC m=+254.341174717" watchObservedRunningTime="2026-04-22 15:38:04.87219766 +0000 UTC m=+254.341748460"
Apr 22 15:38:05.085202 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.085173 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9379052-cac4-4c31-819f-c98344e62728" path="/var/lib/kubelet/pods/b9379052-cac4-4c31-819f-c98344e62728/volumes"
Apr 22 15:38:05.817429 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.817403 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5bfcdf575-k48q8"]
Apr 22 15:38:05.820966 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.820948 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.823503 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.823483 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 15:38:05.823630 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.823579 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 15:38:05.823809 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.823795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-p9sfv\""
Apr 22 15:38:05.824039 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.824024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 15:38:05.824090 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.824024 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 15:38:05.825088 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.825070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 15:38:05.830097 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.830075 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 15:38:05.833801 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.833779 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5bfcdf575-k48q8"]
Apr 22 15:38:05.864048 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.864014 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"a65068e13a860db757bf08d1239432141d1472bca60cd2126a7ed18284485df2"}
Apr 22 15:38:05.864048 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.864047 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"8c875ad46a87cd4bc14048028cd3dc7f7f9bb0c9f361b97bc64c7d94b03be7ea"}
Apr 22 15:38:05.864410 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.864057 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"3a3019af40696946535ef54e3e7c44fbb7ceb61888fa849744b7077b5428fb5b"}
Apr 22 15:38:05.864410 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.864070 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"b98f2294c5241a016ad110c836378dcfaae244126257cf07c4df431e2087c519"}
Apr 22 15:38:05.864410 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.864082 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"7f0714c97f77599d4ae009c6e09ad82df3a25241d1a4afc7de54bc5f44c79a0a"}
Apr 22 15:38:05.864410 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.864091 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8040e9a9-11fe-47b1-9008-0b52af51bfc0","Type":"ContainerStarted","Data":"4b611ebc00f07cb6dd00c2e3180438d08d25960edf2d38d5bc511ce64de2c844"}
Apr 22 15:38:05.895743 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.895708 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.895695808 podStartE2EDuration="2.895695808s" podCreationTimestamp="2026-04-22 15:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:38:05.893217914 +0000 UTC m=+255.362768724" watchObservedRunningTime="2026-04-22 15:38:05.895695808 +0000 UTC m=+255.365246607"
Apr 22 15:38:05.930759 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.930736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-serving-certs-ca-bundle\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.930853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.930763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-secret-telemeter-client\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.930853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.930782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.930853 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.930842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-telemeter-client-tls\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.931027 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.930970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.931098 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.931074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-federate-client-tls\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.931303 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.931281 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-metrics-client-ca\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:05.931358 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:05.931321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8dpm\" (UniqueName: \"kubernetes.io/projected/ea023f6d-4e9d-4614-96f2-d0fe9f050310-kube-api-access-d8dpm\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032040 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-federate-client-tls\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032123 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-metrics-client-ca\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032123 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8dpm\" (UniqueName: \"kubernetes.io/projected/ea023f6d-4e9d-4614-96f2-d0fe9f050310-kube-api-access-d8dpm\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032123 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-serving-certs-ca-bundle\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032123 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-secret-telemeter-client\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032279 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032279 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-telemeter-client-tls\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.032279 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.033029 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-metrics-client-ca\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.033118 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.032982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-serving-certs-ca-bundle\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.033346 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.033323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea023f6d-4e9d-4614-96f2-d0fe9f050310-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.034726 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.034704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-federate-client-tls\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.034788 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.034774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-secret-telemeter-client\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.034921 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.034886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-telemeter-client-tls\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.034958 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.034923 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea023f6d-4e9d-4614-96f2-d0fe9f050310-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.039769 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.039752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8dpm\" (UniqueName: \"kubernetes.io/projected/ea023f6d-4e9d-4614-96f2-d0fe9f050310-kube-api-access-d8dpm\") pod \"telemeter-client-5bfcdf575-k48q8\" (UID: \"ea023f6d-4e9d-4614-96f2-d0fe9f050310\") " pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.130752 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.130696 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8"
Apr 22 15:38:06.270790 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.270766 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5bfcdf575-k48q8"]
Apr 22 15:38:06.273282 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:38:06.273255 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea023f6d_4e9d_4614_96f2_d0fe9f050310.slice/crio-58b41c14fa1ff25db2692fef751a43b1f95ea53132c447152f35ba9b3a1893ac WatchSource:0}: Error finding container 58b41c14fa1ff25db2692fef751a43b1f95ea53132c447152f35ba9b3a1893ac: Status 404 returned error can't find the container with id 58b41c14fa1ff25db2692fef751a43b1f95ea53132c447152f35ba9b3a1893ac
Apr 22 15:38:06.868799 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:06.868765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8" event={"ID":"ea023f6d-4e9d-4614-96f2-d0fe9f050310","Type":"ContainerStarted","Data":"58b41c14fa1ff25db2692fef751a43b1f95ea53132c447152f35ba9b3a1893ac"}
Apr 22 15:38:08.876481 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:08.876403 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8" event={"ID":"ea023f6d-4e9d-4614-96f2-d0fe9f050310","Type":"ContainerStarted","Data":"878ba6a13f062afbdca1d88625dc310fa0cf8b094f4ec6a110c789601bdd8bf1"}
Apr 22 15:38:08.876481 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:08.876446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8" event={"ID":"ea023f6d-4e9d-4614-96f2-d0fe9f050310","Type":"ContainerStarted","Data":"c842422ca9bc9f7167cc582eb956db8e6723225ff19b2f3670fe48429ec021a9"}
Apr 22 15:38:08.876481 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:08.876461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8" event={"ID":"ea023f6d-4e9d-4614-96f2-d0fe9f050310","Type":"ContainerStarted","Data":"035ab86fcac4f8a303762085147398a93841416405cc5b14cb6631ba6b9444d5"}
Apr 22 15:38:08.899504 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:08.899453 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5bfcdf575-k48q8" podStartSLOduration=1.679710117 podStartE2EDuration="3.899436869s" podCreationTimestamp="2026-04-22 15:38:05 +0000 UTC" firstStartedPulling="2026-04-22 15:38:06.2751572 +0000 UTC m=+255.744707978" lastFinishedPulling="2026-04-22 15:38:08.494883953 +0000 UTC m=+257.964434730" observedRunningTime="2026-04-22 15:38:08.89799755 +0000 UTC m=+258.367548350" watchObservedRunningTime="2026-04-22 15:38:08.899436869 +0000 UTC m=+258.368987670"
Apr 22 15:38:50.983912 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:38:50.983876 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 15:40:18.925229 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.925147 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"]
Apr 22 15:40:18.927196 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.927181 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:18.930118 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.930082 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 15:40:18.930357 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.930343 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 15:40:18.930429 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.930346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 15:40:18.931023 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.931001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 15:40:18.935653 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:18.935630 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"]
Apr 22 15:40:19.020274 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.020247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfp2n\" (UniqueName: \"kubernetes.io/projected/d8f177ba-4ab9-4f45-9aee-465b77854a0f-kube-api-access-cfp2n\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.020412 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.020321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d8f177ba-4ab9-4f45-9aee-465b77854a0f-klusterlet-config\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.020412 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.020360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d8f177ba-4ab9-4f45-9aee-465b77854a0f-tmp\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.121552 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.121529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d8f177ba-4ab9-4f45-9aee-465b77854a0f-tmp\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.121671 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.121564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfp2n\" (UniqueName: \"kubernetes.io/projected/d8f177ba-4ab9-4f45-9aee-465b77854a0f-kube-api-access-cfp2n\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.121671 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.121652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d8f177ba-4ab9-4f45-9aee-465b77854a0f-klusterlet-config\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.121991 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.121969 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d8f177ba-4ab9-4f45-9aee-465b77854a0f-tmp\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.124165 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.124148 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d8f177ba-4ab9-4f45-9aee-465b77854a0f-klusterlet-config\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.129068 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.129038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfp2n\" (UniqueName: \"kubernetes.io/projected/d8f177ba-4ab9-4f45-9aee-465b77854a0f-kube-api-access-cfp2n\") pod \"klusterlet-addon-workmgr-c84fc88f4-mc5ks\" (UID: \"d8f177ba-4ab9-4f45-9aee-465b77854a0f\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.236241 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.236212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:19.363661 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.363538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"]
Apr 22 15:40:19.366575 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:40:19.366542 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f177ba_4ab9_4f45_9aee_465b77854a0f.slice/crio-1aff2b7b9da4fde4d3872a7b7217568475ede48b6c8a27a9ba77a9880a32ffa3 WatchSource:0}: Error finding container 1aff2b7b9da4fde4d3872a7b7217568475ede48b6c8a27a9ba77a9880a32ffa3: Status 404 returned error can't find the container with id 1aff2b7b9da4fde4d3872a7b7217568475ede48b6c8a27a9ba77a9880a32ffa3
Apr 22 15:40:19.371603 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:19.371581 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:40:20.241122 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:20.241086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks" event={"ID":"d8f177ba-4ab9-4f45-9aee-465b77854a0f","Type":"ContainerStarted","Data":"1aff2b7b9da4fde4d3872a7b7217568475ede48b6c8a27a9ba77a9880a32ffa3"}
Apr 22 15:40:24.256171 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:24.256129 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks" event={"ID":"d8f177ba-4ab9-4f45-9aee-465b77854a0f","Type":"ContainerStarted","Data":"6240d7bb9c6de549ba587a121b841ccc85c8d2cbec361b1bd03c5b53182667c6"}
Apr 22 15:40:24.256605 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:24.256339 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:24.257993 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:24.257974 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks"
Apr 22 15:40:24.274052 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:40:24.273997 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-c84fc88f4-mc5ks" podStartSLOduration=2.264392756 podStartE2EDuration="6.273986175s" podCreationTimestamp="2026-04-22 15:40:18 +0000 UTC" firstStartedPulling="2026-04-22 15:40:19.371786344 +0000 UTC m=+388.841337142" lastFinishedPulling="2026-04-22 15:40:23.38137978 +0000 UTC m=+392.850930561" observedRunningTime="2026-04-22 15:40:24.272484065 +0000 UTC m=+393.742034865" watchObservedRunningTime="2026-04-22 15:40:24.273986175 +0000 UTC m=+393.743537008"
Apr 22 15:41:01.148029 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.147992 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"]
Apr 22 15:41:01.155440 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.155415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.157946 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.157893 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 15:41:01.158091 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.158001 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 15:41:01.158884 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.158864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tp2s9\""
Apr 22 15:41:01.163243 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.163220 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"]
Apr 22 15:41:01.233381 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.233344 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mxk6\" (UniqueName: \"kubernetes.io/projected/cda92849-cc61-4956-8840-37ec806d1ad4-kube-api-access-5mxk6\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.233381 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.233397 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.233624 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.233412 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.334073 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.334030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mxk6\" (UniqueName: \"kubernetes.io/projected/cda92849-cc61-4956-8840-37ec806d1ad4-kube-api-access-5mxk6\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.334258 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.334119 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.334258 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.334153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.334565 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.334547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.334601 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.334563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.343630 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.343604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mxk6\" (UniqueName: \"kubernetes.io/projected/cda92849-cc61-4956-8840-37ec806d1ad4-kube-api-access-5mxk6\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.465253 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.465164 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"
Apr 22 15:41:01.610361 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:01.610167 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj"]
Apr 22 15:41:01.613077 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:41:01.613036 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcda92849_cc61_4956_8840_37ec806d1ad4.slice/crio-535ce5d5ab40e4bc67850e4c8c5aac42bebabe693d64793afa51114a59a36093 WatchSource:0}: Error finding container 535ce5d5ab40e4bc67850e4c8c5aac42bebabe693d64793afa51114a59a36093: Status 404 returned error can't find the container with id 535ce5d5ab40e4bc67850e4c8c5aac42bebabe693d64793afa51114a59a36093
Apr 22 15:41:02.365462 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:02.365425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" event={"ID":"cda92849-cc61-4956-8840-37ec806d1ad4","Type":"ContainerStarted","Data":"535ce5d5ab40e4bc67850e4c8c5aac42bebabe693d64793afa51114a59a36093"}
Apr 22 15:41:07.381501 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:07.381460 2577 generic.go:358] "Generic (PLEG): container finished" podID="cda92849-cc61-4956-8840-37ec806d1ad4" containerID="26cc5ec5a629715a88048fc340dca423ef2883d0e698ca57cbbeebd2fed47b76" exitCode=0
Apr 22 15:41:07.381881 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:07.381553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" event={"ID":"cda92849-cc61-4956-8840-37ec806d1ad4","Type":"ContainerDied","Data":"26cc5ec5a629715a88048fc340dca423ef2883d0e698ca57cbbeebd2fed47b76"}
Apr 22 15:41:10.392512 ip-10-0-143-30 kubenswrapper[2577]: I0422
15:41:10.392476 2577 generic.go:358] "Generic (PLEG): container finished" podID="cda92849-cc61-4956-8840-37ec806d1ad4" containerID="1684a906410effb25423b7f4887619066d0e29daa715c8e47b5e799a2d97aa9e" exitCode=0 Apr 22 15:41:10.392916 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:10.392544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" event={"ID":"cda92849-cc61-4956-8840-37ec806d1ad4","Type":"ContainerDied","Data":"1684a906410effb25423b7f4887619066d0e29daa715c8e47b5e799a2d97aa9e"} Apr 22 15:41:17.413816 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:17.413776 2577 generic.go:358] "Generic (PLEG): container finished" podID="cda92849-cc61-4956-8840-37ec806d1ad4" containerID="9786559d59fe16acdea72b6d74526a10d15930f4d735fe314caad06c6f95b2d4" exitCode=0 Apr 22 15:41:17.414226 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:17.413849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" event={"ID":"cda92849-cc61-4956-8840-37ec806d1ad4","Type":"ContainerDied","Data":"9786559d59fe16acdea72b6d74526a10d15930f4d735fe314caad06c6f95b2d4"} Apr 22 15:41:18.539522 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.539500 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" Apr 22 15:41:18.691007 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.690918 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mxk6\" (UniqueName: \"kubernetes.io/projected/cda92849-cc61-4956-8840-37ec806d1ad4-kube-api-access-5mxk6\") pod \"cda92849-cc61-4956-8840-37ec806d1ad4\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " Apr 22 15:41:18.691188 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.691018 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-bundle\") pod \"cda92849-cc61-4956-8840-37ec806d1ad4\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " Apr 22 15:41:18.691188 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.691062 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-util\") pod \"cda92849-cc61-4956-8840-37ec806d1ad4\" (UID: \"cda92849-cc61-4956-8840-37ec806d1ad4\") " Apr 22 15:41:18.691643 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.691615 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-bundle" (OuterVolumeSpecName: "bundle") pod "cda92849-cc61-4956-8840-37ec806d1ad4" (UID: "cda92849-cc61-4956-8840-37ec806d1ad4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:41:18.693343 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.693313 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda92849-cc61-4956-8840-37ec806d1ad4-kube-api-access-5mxk6" (OuterVolumeSpecName: "kube-api-access-5mxk6") pod "cda92849-cc61-4956-8840-37ec806d1ad4" (UID: "cda92849-cc61-4956-8840-37ec806d1ad4"). InnerVolumeSpecName "kube-api-access-5mxk6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:41:18.695402 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.695380 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-util" (OuterVolumeSpecName: "util") pod "cda92849-cc61-4956-8840-37ec806d1ad4" (UID: "cda92849-cc61-4956-8840-37ec806d1ad4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:41:18.791919 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.791867 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-bundle\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:41:18.791919 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.791916 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cda92849-cc61-4956-8840-37ec806d1ad4-util\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:41:18.791919 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:18.791928 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5mxk6\" (UniqueName: \"kubernetes.io/projected/cda92849-cc61-4956-8840-37ec806d1ad4-kube-api-access-5mxk6\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:41:19.420934 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:19.420822 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" event={"ID":"cda92849-cc61-4956-8840-37ec806d1ad4","Type":"ContainerDied","Data":"535ce5d5ab40e4bc67850e4c8c5aac42bebabe693d64793afa51114a59a36093"} Apr 22 15:41:19.420934 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:19.420858 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535ce5d5ab40e4bc67850e4c8c5aac42bebabe693d64793afa51114a59a36093" Apr 22 15:41:19.420934 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:19.420892 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d6wmgj" Apr 22 15:41:24.162567 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162534 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df"] Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162844 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="util" Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162856 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="util" Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162870 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="pull" Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162876 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="pull" Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162888 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="extract" Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162893 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="extract" Apr 22 15:41:24.162999 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.162958 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cda92849-cc61-4956-8840-37ec806d1ad4" containerName="extract" Apr 22 15:41:24.214908 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.214876 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df"] Apr 22 15:41:24.215057 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.215021 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.217676 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.217656 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 15:41:24.217809 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.217685 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:41:24.218270 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.218257 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-ccnbl\"" Apr 22 15:41:24.337237 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.337192 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c725eabb-346e-47eb-854d-97a0c8724019-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-nh9df\" (UID: 
\"c725eabb-346e-47eb-854d-97a0c8724019\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.337237 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.337243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hts9t\" (UniqueName: \"kubernetes.io/projected/c725eabb-346e-47eb-854d-97a0c8724019-kube-api-access-hts9t\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-nh9df\" (UID: \"c725eabb-346e-47eb-854d-97a0c8724019\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.438475 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.438385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c725eabb-346e-47eb-854d-97a0c8724019-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-nh9df\" (UID: \"c725eabb-346e-47eb-854d-97a0c8724019\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.438475 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.438428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hts9t\" (UniqueName: \"kubernetes.io/projected/c725eabb-346e-47eb-854d-97a0c8724019-kube-api-access-hts9t\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-nh9df\" (UID: \"c725eabb-346e-47eb-854d-97a0c8724019\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.438748 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.438731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c725eabb-346e-47eb-854d-97a0c8724019-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-nh9df\" (UID: \"c725eabb-346e-47eb-854d-97a0c8724019\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.448046 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.448018 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hts9t\" (UniqueName: \"kubernetes.io/projected/c725eabb-346e-47eb-854d-97a0c8724019-kube-api-access-hts9t\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-nh9df\" (UID: \"c725eabb-346e-47eb-854d-97a0c8724019\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.524492 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.524450 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" Apr 22 15:41:24.657017 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:24.656993 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df"] Apr 22 15:41:24.660427 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:41:24.660398 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc725eabb_346e_47eb_854d_97a0c8724019.slice/crio-dcdf96b7625026a1da9863795b0a898b6e4b8a336d89bfcc5f25ed03bc3b3dc5 WatchSource:0}: Error finding container dcdf96b7625026a1da9863795b0a898b6e4b8a336d89bfcc5f25ed03bc3b3dc5: Status 404 returned error can't find the container with id dcdf96b7625026a1da9863795b0a898b6e4b8a336d89bfcc5f25ed03bc3b3dc5 Apr 22 15:41:25.440310 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:25.440275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" event={"ID":"c725eabb-346e-47eb-854d-97a0c8724019","Type":"ContainerStarted","Data":"dcdf96b7625026a1da9863795b0a898b6e4b8a336d89bfcc5f25ed03bc3b3dc5"} Apr 22 15:41:27.448739 
ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:27.448696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" event={"ID":"c725eabb-346e-47eb-854d-97a0c8724019","Type":"ContainerStarted","Data":"3b6627fb3280bf0f0ca790b499dc89b4e17ce70dc80c64b376d18e74a43b115c"} Apr 22 15:41:27.469441 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:27.469371 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-nh9df" podStartSLOduration=1.183146757 podStartE2EDuration="3.469350356s" podCreationTimestamp="2026-04-22 15:41:24 +0000 UTC" firstStartedPulling="2026-04-22 15:41:24.663453919 +0000 UTC m=+454.133004695" lastFinishedPulling="2026-04-22 15:41:26.949657515 +0000 UTC m=+456.419208294" observedRunningTime="2026-04-22 15:41:27.467253175 +0000 UTC m=+456.936803972" watchObservedRunningTime="2026-04-22 15:41:27.469350356 +0000 UTC m=+456.938901160" Apr 22 15:41:30.624713 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.624679 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-mr44x"] Apr 22 15:41:30.628382 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.628364 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.630770 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.630746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-pbbl7\"" Apr 22 15:41:30.631486 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.631467 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 15:41:30.631486 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.631483 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 15:41:30.638571 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.638549 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-mr44x"] Apr 22 15:41:30.794489 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.794449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8mk8\" (UniqueName: \"kubernetes.io/projected/12fed035-21d1-4b42-b7c0-d9da01550a04-kube-api-access-p8mk8\") pod \"cert-manager-webhook-587ccfb98-mr44x\" (UID: \"12fed035-21d1-4b42-b7c0-d9da01550a04\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.794489 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.794485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12fed035-21d1-4b42-b7c0-d9da01550a04-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-mr44x\" (UID: \"12fed035-21d1-4b42-b7c0-d9da01550a04\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.895833 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.895742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8mk8\" 
(UniqueName: \"kubernetes.io/projected/12fed035-21d1-4b42-b7c0-d9da01550a04-kube-api-access-p8mk8\") pod \"cert-manager-webhook-587ccfb98-mr44x\" (UID: \"12fed035-21d1-4b42-b7c0-d9da01550a04\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.895833 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.895778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12fed035-21d1-4b42-b7c0-d9da01550a04-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-mr44x\" (UID: \"12fed035-21d1-4b42-b7c0-d9da01550a04\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.913850 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.913814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/12fed035-21d1-4b42-b7c0-d9da01550a04-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-mr44x\" (UID: \"12fed035-21d1-4b42-b7c0-d9da01550a04\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.914015 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.913932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8mk8\" (UniqueName: \"kubernetes.io/projected/12fed035-21d1-4b42-b7c0-d9da01550a04-kube-api-access-p8mk8\") pod \"cert-manager-webhook-587ccfb98-mr44x\" (UID: \"12fed035-21d1-4b42-b7c0-d9da01550a04\") " pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:30.949654 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:30.949620 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:31.080694 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:31.080658 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-mr44x"] Apr 22 15:41:31.083767 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:41:31.083736 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12fed035_21d1_4b42_b7c0_d9da01550a04.slice/crio-4af17e486fcb4cd43735b7775674c15a64df902423f23478ae0bc9a3a5009ae1 WatchSource:0}: Error finding container 4af17e486fcb4cd43735b7775674c15a64df902423f23478ae0bc9a3a5009ae1: Status 404 returned error can't find the container with id 4af17e486fcb4cd43735b7775674c15a64df902423f23478ae0bc9a3a5009ae1 Apr 22 15:41:31.462867 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:31.462833 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" event={"ID":"12fed035-21d1-4b42-b7c0-d9da01550a04","Type":"ContainerStarted","Data":"4af17e486fcb4cd43735b7775674c15a64df902423f23478ae0bc9a3a5009ae1"} Apr 22 15:41:34.475734 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:34.475700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" event={"ID":"12fed035-21d1-4b42-b7c0-d9da01550a04","Type":"ContainerStarted","Data":"e5bae2713721fbb308857adde31a5309ee5b0839117286ef17ad228f8f8db171"} Apr 22 15:41:34.476196 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:34.475759 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:34.492921 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:34.492848 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" podStartSLOduration=1.949427588 
podStartE2EDuration="4.492832018s" podCreationTimestamp="2026-04-22 15:41:30 +0000 UTC" firstStartedPulling="2026-04-22 15:41:31.086013407 +0000 UTC m=+460.555564184" lastFinishedPulling="2026-04-22 15:41:33.629417833 +0000 UTC m=+463.098968614" observedRunningTime="2026-04-22 15:41:34.490205354 +0000 UTC m=+463.959756150" watchObservedRunningTime="2026-04-22 15:41:34.492832018 +0000 UTC m=+463.962382817" Apr 22 15:41:35.410257 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.410221 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qpxn2"] Apr 22 15:41:35.413458 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.413442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.415855 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.415833 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-9t7rm\"" Apr 22 15:41:35.423810 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.423785 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qpxn2"] Apr 22 15:41:35.429507 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.429485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7kvj\" (UniqueName: \"kubernetes.io/projected/e47798af-9b6a-461b-bf19-4f01a58a2719-kube-api-access-s7kvj\") pod \"cert-manager-cainjector-68b757865b-qpxn2\" (UID: \"e47798af-9b6a-461b-bf19-4f01a58a2719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.429596 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.429523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e47798af-9b6a-461b-bf19-4f01a58a2719-bound-sa-token\") 
pod \"cert-manager-cainjector-68b757865b-qpxn2\" (UID: \"e47798af-9b6a-461b-bf19-4f01a58a2719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.529876 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.529842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7kvj\" (UniqueName: \"kubernetes.io/projected/e47798af-9b6a-461b-bf19-4f01a58a2719-kube-api-access-s7kvj\") pod \"cert-manager-cainjector-68b757865b-qpxn2\" (UID: \"e47798af-9b6a-461b-bf19-4f01a58a2719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.529876 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.529883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e47798af-9b6a-461b-bf19-4f01a58a2719-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qpxn2\" (UID: \"e47798af-9b6a-461b-bf19-4f01a58a2719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.538533 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.538507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e47798af-9b6a-461b-bf19-4f01a58a2719-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-qpxn2\" (UID: \"e47798af-9b6a-461b-bf19-4f01a58a2719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.538818 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.538795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7kvj\" (UniqueName: \"kubernetes.io/projected/e47798af-9b6a-461b-bf19-4f01a58a2719-kube-api-access-s7kvj\") pod \"cert-manager-cainjector-68b757865b-qpxn2\" (UID: \"e47798af-9b6a-461b-bf19-4f01a58a2719\") " pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.722703 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.722608 2577 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" Apr 22 15:41:35.851600 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:35.851566 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-qpxn2"] Apr 22 15:41:35.854759 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:41:35.854727 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode47798af_9b6a_461b_bf19_4f01a58a2719.slice/crio-5e341e540a8b5c0a2ded5b3f44c76763d8ca094ca2e5d8c9996832f3e32b0c98 WatchSource:0}: Error finding container 5e341e540a8b5c0a2ded5b3f44c76763d8ca094ca2e5d8c9996832f3e32b0c98: Status 404 returned error can't find the container with id 5e341e540a8b5c0a2ded5b3f44c76763d8ca094ca2e5d8c9996832f3e32b0c98 Apr 22 15:41:36.483543 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:36.483508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" event={"ID":"e47798af-9b6a-461b-bf19-4f01a58a2719","Type":"ContainerStarted","Data":"9b6d3c1e530637df4ed990d02572e7bca9d6de4ce842d294f4bf542907d55df4"} Apr 22 15:41:36.483543 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:36.483543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" event={"ID":"e47798af-9b6a-461b-bf19-4f01a58a2719","Type":"ContainerStarted","Data":"5e341e540a8b5c0a2ded5b3f44c76763d8ca094ca2e5d8c9996832f3e32b0c98"} Apr 22 15:41:36.501586 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:36.501538 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-qpxn2" podStartSLOduration=1.501524165 podStartE2EDuration="1.501524165s" podCreationTimestamp="2026-04-22 15:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:41:36.499338882 +0000 UTC m=+465.968889683" watchObservedRunningTime="2026-04-22 15:41:36.501524165 +0000 UTC m=+465.971074964" Apr 22 15:41:40.481974 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:40.481940 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-mr44x" Apr 22 15:41:55.014102 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.014066 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l"] Apr 22 15:41:55.017712 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.017694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.025613 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.025589 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 15:41:55.026070 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.026049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 15:41:55.026188 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.026077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tp2s9\"" Apr 22 15:41:55.032934 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.032888 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l"] Apr 22 15:41:55.075583 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.075553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.075747 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.075605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df79p\" (UniqueName: \"kubernetes.io/projected/bd39061a-733a-4df5-bf64-45e128e269d2-kube-api-access-df79p\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.075747 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.075669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.176717 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.176682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.176886 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.176740 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.176886 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.176778 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df79p\" (UniqueName: \"kubernetes.io/projected/bd39061a-733a-4df5-bf64-45e128e269d2-kube-api-access-df79p\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.177116 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.177099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.177200 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.177122 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.193102 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.193071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-df79p\" (UniqueName: \"kubernetes.io/projected/bd39061a-733a-4df5-bf64-45e128e269d2-kube-api-access-df79p\") pod 
\"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.327475 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.327379 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:41:55.455799 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.455776 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l"] Apr 22 15:41:55.456786 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:41:55.456759 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd39061a_733a_4df5_bf64_45e128e269d2.slice/crio-191893c61ab2974bf1dcc3fa7551c0ce4572e7516d8bc6d559ce6a11678e168f WatchSource:0}: Error finding container 191893c61ab2974bf1dcc3fa7551c0ce4572e7516d8bc6d559ce6a11678e168f: Status 404 returned error can't find the container with id 191893c61ab2974bf1dcc3fa7551c0ce4572e7516d8bc6d559ce6a11678e168f Apr 22 15:41:55.547235 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.547199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" event={"ID":"bd39061a-733a-4df5-bf64-45e128e269d2","Type":"ContainerStarted","Data":"55fd25974728dcccc72ca69b29df565c20e07f29012e63c309ba8c13c66f121e"} Apr 22 15:41:55.547235 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:55.547241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" 
event={"ID":"bd39061a-733a-4df5-bf64-45e128e269d2","Type":"ContainerStarted","Data":"191893c61ab2974bf1dcc3fa7551c0ce4572e7516d8bc6d559ce6a11678e168f"} Apr 22 15:41:56.551806 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:56.551771 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd39061a-733a-4df5-bf64-45e128e269d2" containerID="55fd25974728dcccc72ca69b29df565c20e07f29012e63c309ba8c13c66f121e" exitCode=0 Apr 22 15:41:56.552228 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:56.551861 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" event={"ID":"bd39061a-733a-4df5-bf64-45e128e269d2","Type":"ContainerDied","Data":"55fd25974728dcccc72ca69b29df565c20e07f29012e63c309ba8c13c66f121e"} Apr 22 15:41:59.565328 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:59.565291 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd39061a-733a-4df5-bf64-45e128e269d2" containerID="eee52f23006321c2fb6e77adf5987249117b282439ceb526e3484a1777a5d3ce" exitCode=0 Apr 22 15:41:59.565725 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:41:59.565375 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" event={"ID":"bd39061a-733a-4df5-bf64-45e128e269d2","Type":"ContainerDied","Data":"eee52f23006321c2fb6e77adf5987249117b282439ceb526e3484a1777a5d3ce"} Apr 22 15:42:00.570646 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:00.570605 2577 generic.go:358] "Generic (PLEG): container finished" podID="bd39061a-733a-4df5-bf64-45e128e269d2" containerID="7c905cd90c2f1fcb57a725de9d5a79efa1e3f571fa942f78557ce8ee367b1647" exitCode=0 Apr 22 15:42:00.570646 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:00.570647 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" 
event={"ID":"bd39061a-733a-4df5-bf64-45e128e269d2","Type":"ContainerDied","Data":"7c905cd90c2f1fcb57a725de9d5a79efa1e3f571fa942f78557ce8ee367b1647"} Apr 22 15:42:01.692953 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.692927 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:42:01.740123 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.740093 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-util\") pod \"bd39061a-733a-4df5-bf64-45e128e269d2\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " Apr 22 15:42:01.740123 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.740128 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df79p\" (UniqueName: \"kubernetes.io/projected/bd39061a-733a-4df5-bf64-45e128e269d2-kube-api-access-df79p\") pod \"bd39061a-733a-4df5-bf64-45e128e269d2\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " Apr 22 15:42:01.740374 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.740206 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-bundle\") pod \"bd39061a-733a-4df5-bf64-45e128e269d2\" (UID: \"bd39061a-733a-4df5-bf64-45e128e269d2\") " Apr 22 15:42:01.740654 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.740626 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-bundle" (OuterVolumeSpecName: "bundle") pod "bd39061a-733a-4df5-bf64-45e128e269d2" (UID: "bd39061a-733a-4df5-bf64-45e128e269d2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:42:01.742453 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.742427 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd39061a-733a-4df5-bf64-45e128e269d2-kube-api-access-df79p" (OuterVolumeSpecName: "kube-api-access-df79p") pod "bd39061a-733a-4df5-bf64-45e128e269d2" (UID: "bd39061a-733a-4df5-bf64-45e128e269d2"). InnerVolumeSpecName "kube-api-access-df79p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:42:01.744530 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.744493 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-util" (OuterVolumeSpecName: "util") pod "bd39061a-733a-4df5-bf64-45e128e269d2" (UID: "bd39061a-733a-4df5-bf64-45e128e269d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:42:01.841273 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.841187 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-bundle\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:42:01.841273 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.841217 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bd39061a-733a-4df5-bf64-45e128e269d2-util\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:42:01.841273 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:01.841227 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-df79p\" (UniqueName: \"kubernetes.io/projected/bd39061a-733a-4df5-bf64-45e128e269d2-kube-api-access-df79p\") on node \"ip-10-0-143-30.ec2.internal\" DevicePath \"\"" Apr 22 15:42:02.578512 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:02.578474 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" event={"ID":"bd39061a-733a-4df5-bf64-45e128e269d2","Type":"ContainerDied","Data":"191893c61ab2974bf1dcc3fa7551c0ce4572e7516d8bc6d559ce6a11678e168f"} Apr 22 15:42:02.578512 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:02.578522 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="191893c61ab2974bf1dcc3fa7551c0ce4572e7516d8bc6d559ce6a11678e168f" Apr 22 15:42:02.578773 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:02.578487 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emw59l" Apr 22 15:42:08.653812 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.653774 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-mn45p"] Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654127 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="extract" Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654141 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="extract" Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654175 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="util" Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654181 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="util" Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654188 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="pull" Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654194 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="pull" Apr 22 15:42:08.654296 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.654244 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd39061a-733a-4df5-bf64-45e128e269d2" containerName="extract" Apr 22 15:42:08.657010 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.656992 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:08.665655 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.665638 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-rzpxz\"" Apr 22 15:42:08.667828 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.667812 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 22 15:42:08.667883 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.667857 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:42:08.680236 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.680202 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-mn45p"] Apr 22 15:42:08.804308 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.804258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d-tmp\") pod \"jobset-operator-747c5859c7-mn45p\" (UID: \"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 
15:42:08.804491 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.804323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvt9\" (UniqueName: \"kubernetes.io/projected/48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d-kube-api-access-5qvt9\") pod \"jobset-operator-747c5859c7-mn45p\" (UID: \"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:08.905848 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.905747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvt9\" (UniqueName: \"kubernetes.io/projected/48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d-kube-api-access-5qvt9\") pod \"jobset-operator-747c5859c7-mn45p\" (UID: \"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:08.905848 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.905837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d-tmp\") pod \"jobset-operator-747c5859c7-mn45p\" (UID: \"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:08.906259 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.906239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d-tmp\") pod \"jobset-operator-747c5859c7-mn45p\" (UID: \"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:08.942397 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.942364 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvt9\" (UniqueName: \"kubernetes.io/projected/48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d-kube-api-access-5qvt9\") pod 
\"jobset-operator-747c5859c7-mn45p\" (UID: \"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:08.966200 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:08.966160 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" Apr 22 15:42:09.112108 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:09.112044 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-mn45p"] Apr 22 15:42:09.114586 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:42:09.114556 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48b3cb8e_6a8f_4fb1_96ba_8f6b2fea700d.slice/crio-20f87509abc5c43987f834ea1b1bd9df1f0222f10f1ef08140d28a73f715a0ef WatchSource:0}: Error finding container 20f87509abc5c43987f834ea1b1bd9df1f0222f10f1ef08140d28a73f715a0ef: Status 404 returned error can't find the container with id 20f87509abc5c43987f834ea1b1bd9df1f0222f10f1ef08140d28a73f715a0ef Apr 22 15:42:09.604022 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:09.603986 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" event={"ID":"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d","Type":"ContainerStarted","Data":"20f87509abc5c43987f834ea1b1bd9df1f0222f10f1ef08140d28a73f715a0ef"} Apr 22 15:42:14.620794 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:14.620722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" event={"ID":"48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d","Type":"ContainerStarted","Data":"68b40a145e669eabc7269bad155a1ce6c405999b1415799535327115f5c0f5d8"} Apr 22 15:42:14.642478 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:14.642416 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-operator-747c5859c7-mn45p" podStartSLOduration=1.3996750709999999 podStartE2EDuration="6.642397003s" podCreationTimestamp="2026-04-22 15:42:08 +0000 UTC" firstStartedPulling="2026-04-22 15:42:09.116047475 +0000 UTC m=+498.585598254" lastFinishedPulling="2026-04-22 15:42:14.358769406 +0000 UTC m=+503.828320186" observedRunningTime="2026-04-22 15:42:14.642015111 +0000 UTC m=+504.111565911" watchObservedRunningTime="2026-04-22 15:42:14.642397003 +0000 UTC m=+504.111947803" Apr 22 15:42:18.507473 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.507436 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c"] Apr 22 15:42:18.509762 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.509745 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.512724 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.512699 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 22 15:42:18.512841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.512789 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 22 15:42:18.512841 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.512805 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 22 15:42:18.512950 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.512796 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-wtcnz\"" Apr 22 15:42:18.517924 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.517873 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c"] Apr 22 15:42:18.689332 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.689292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddqn5\" (UniqueName: \"kubernetes.io/projected/1d2087aa-27b3-4294-ba43-d48f6443d0e6-kube-api-access-ddqn5\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.689332 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.689336 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2087aa-27b3-4294-ba43-d48f6443d0e6-metrics-certs\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.689553 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.689372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d2087aa-27b3-4294-ba43-d48f6443d0e6-cert\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.689553 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.689446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1d2087aa-27b3-4294-ba43-d48f6443d0e6-manager-config\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.790281 ip-10-0-143-30 
kubenswrapper[2577]: I0422 15:42:18.790195 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddqn5\" (UniqueName: \"kubernetes.io/projected/1d2087aa-27b3-4294-ba43-d48f6443d0e6-kube-api-access-ddqn5\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.790281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.790230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2087aa-27b3-4294-ba43-d48f6443d0e6-metrics-certs\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.790281 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.790256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d2087aa-27b3-4294-ba43-d48f6443d0e6-cert\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.790496 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.790316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1d2087aa-27b3-4294-ba43-d48f6443d0e6-manager-config\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.791058 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.791039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/1d2087aa-27b3-4294-ba43-d48f6443d0e6-manager-config\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.792931 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.792888 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2087aa-27b3-4294-ba43-d48f6443d0e6-metrics-certs\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.793025 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.793011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d2087aa-27b3-4294-ba43-d48f6443d0e6-cert\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.798224 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.798202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddqn5\" (UniqueName: \"kubernetes.io/projected/1d2087aa-27b3-4294-ba43-d48f6443d0e6-kube-api-access-ddqn5\") pod \"jobset-controller-manager-7889f9478d-hqh9c\" (UID: \"1d2087aa-27b3-4294-ba43-d48f6443d0e6\") " pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.820885 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.820850 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:18.944585 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:18.944559 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c"] Apr 22 15:42:18.947077 ip-10-0-143-30 kubenswrapper[2577]: W0422 15:42:18.947047 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2087aa_27b3_4294_ba43_d48f6443d0e6.slice/crio-28078a57def2441fe55841648a2cc04b49d8ac4acd37d62e93adc5ee0e5120fa WatchSource:0}: Error finding container 28078a57def2441fe55841648a2cc04b49d8ac4acd37d62e93adc5ee0e5120fa: Status 404 returned error can't find the container with id 28078a57def2441fe55841648a2cc04b49d8ac4acd37d62e93adc5ee0e5120fa Apr 22 15:42:19.638085 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:19.638049 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" event={"ID":"1d2087aa-27b3-4294-ba43-d48f6443d0e6","Type":"ContainerStarted","Data":"28078a57def2441fe55841648a2cc04b49d8ac4acd37d62e93adc5ee0e5120fa"} Apr 22 15:42:21.650265 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:21.650168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" event={"ID":"1d2087aa-27b3-4294-ba43-d48f6443d0e6","Type":"ContainerStarted","Data":"13617d0b3a393de26bb6526c0715d196b618b083aa91c6376bb4e121a0a9815e"} Apr 22 15:42:21.650265 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:21.650234 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 15:42:21.670467 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:21.670422 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" podStartSLOduration=1.239263549 podStartE2EDuration="3.670408691s" podCreationTimestamp="2026-04-22 15:42:18 +0000 UTC" firstStartedPulling="2026-04-22 15:42:18.948861402 +0000 UTC m=+508.418412179" lastFinishedPulling="2026-04-22 15:42:21.380006541 +0000 UTC m=+510.849557321" observedRunningTime="2026-04-22 15:42:21.669322702 +0000 UTC m=+511.138873503" watchObservedRunningTime="2026-04-22 15:42:21.670408691 +0000 UTC m=+511.139959487" Apr 22 15:42:32.658690 ip-10-0-143-30 kubenswrapper[2577]: I0422 15:42:32.658655 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-7889f9478d-hqh9c" Apr 22 16:28:11.180647 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:11.180562 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gkzsd_b89c0136-9aaf-4dfa-9c2c-f576dcf334b7/global-pull-secret-syncer/0.log" Apr 22 16:28:11.340758 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:11.340714 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ljlbr_9c1f5b49-88a2-49ca-a478-10b546545331/konnectivity-agent/0.log" Apr 22 16:28:11.461743 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:11.461656 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-30.ec2.internal_7dcbdaa34312fcd881c943ed9a362c63/haproxy/0.log" Apr 22 16:28:14.870549 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:14.870519 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/alertmanager/0.log" Apr 22 16:28:14.896769 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:14.896739 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/config-reloader/0.log" Apr 22 
16:28:14.922892 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:14.922865 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/kube-rbac-proxy-web/0.log"
Apr 22 16:28:14.944864 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:14.944842 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/kube-rbac-proxy/0.log"
Apr 22 16:28:14.968035 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:14.968011 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/kube-rbac-proxy-metric/0.log"
Apr 22 16:28:14.995594 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:14.995574 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/prom-label-proxy/0.log"
Apr 22 16:28:15.023322 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.023295 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8040e9a9-11fe-47b1-9008-0b52af51bfc0/init-config-reloader/0.log"
Apr 22 16:28:15.244177 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.244145 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2g6b8_0e7ee1be-fd13-4aac-b8e4-c8d8f5664851/node-exporter/0.log"
Apr 22 16:28:15.266109 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.266084 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2g6b8_0e7ee1be-fd13-4aac-b8e4-c8d8f5664851/kube-rbac-proxy/0.log"
Apr 22 16:28:15.293909 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.293881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2g6b8_0e7ee1be-fd13-4aac-b8e4-c8d8f5664851/init-textfile/0.log"
Apr 22 16:28:15.480320 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.480294 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kpckp_0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4/kube-rbac-proxy-main/0.log"
Apr 22 16:28:15.507595 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.507518 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kpckp_0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4/kube-rbac-proxy-self/0.log"
Apr 22 16:28:15.529715 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.529692 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-kpckp_0ed083c6-2f3b-4a6b-8e4e-1663652dd2b4/openshift-state-metrics/0.log"
Apr 22 16:28:15.839887 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.839797 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qltc5_c909177c-1f2c-45a6-b86e-d1ebca982be2/prometheus-operator-admission-webhook/0.log"
Apr 22 16:28:15.888456 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.888418 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5bfcdf575-k48q8_ea023f6d-4e9d-4614-96f2-d0fe9f050310/telemeter-client/0.log"
Apr 22 16:28:15.919116 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.919095 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5bfcdf575-k48q8_ea023f6d-4e9d-4614-96f2-d0fe9f050310/reload/0.log"
Apr 22 16:28:15.943717 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:15.943682 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5bfcdf575-k48q8_ea023f6d-4e9d-4614-96f2-d0fe9f050310/kube-rbac-proxy/0.log"
Apr 22 16:28:17.199776 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:17.199744 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-542rl_f1bbe654-6603-4266-aaa8-3f2f9852be83/networking-console-plugin/0.log"
Apr 22 16:28:17.992665 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:17.992634 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-s24ht_d15a5003-51ba-4bc8-a06b-2215afd58ed5/download-server/0.log"
Apr 22 16:28:18.287583 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.287496 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"]
Apr 22 16:28:18.291031 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.291006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.293237 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.293197 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrh8b\"/\"openshift-service-ca.crt\""
Apr 22 16:28:18.293380 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.293301 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrh8b\"/\"kube-root-ca.crt\""
Apr 22 16:28:18.294714 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.294697 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrh8b\"/\"default-dockercfg-rrfw9\""
Apr 22 16:28:18.300325 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.300292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"]
Apr 22 16:28:18.421177 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.421150 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pln49_015ea955-83cf-4c04-8d1e-238e49a24f54/volume-data-source-validator/0.log"
Apr 22 16:28:18.424259 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.424237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbjrc\" (UniqueName: \"kubernetes.io/projected/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-kube-api-access-fbjrc\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.424334 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.424271 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-proc\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.424334 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.424302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-sys\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.424404 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.424378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-lib-modules\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.424439 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.424426 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-podres\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525183 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbjrc\" (UniqueName: \"kubernetes.io/projected/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-kube-api-access-fbjrc\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525183 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525186 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-proc\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-sys\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-lib-modules\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-podres\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-proc\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-sys\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-podres\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.525425 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.525396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-lib-modules\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.533514 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.533479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbjrc\" (UniqueName: \"kubernetes.io/projected/e74c5ff9-907b-48d2-87b9-f10f173f6bf4-kube-api-access-fbjrc\") pod \"perf-node-gather-daemonset-wkxjj\" (UID: \"e74c5ff9-907b-48d2-87b9-f10f173f6bf4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.609168 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.609075 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.730946 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.730923 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"]
Apr 22 16:28:18.733338 ip-10-0-143-30 kubenswrapper[2577]: W0422 16:28:18.733309 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode74c5ff9_907b_48d2_87b9_f10f173f6bf4.slice/crio-4486bd5f5ac08b9032185390373adc0049594753e3c39142990a3665366c9094 WatchSource:0}: Error finding container 4486bd5f5ac08b9032185390373adc0049594753e3c39142990a3665366c9094: Status 404 returned error can't find the container with id 4486bd5f5ac08b9032185390373adc0049594753e3c39142990a3665366c9094
Apr 22 16:28:18.735008 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.734988 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:28:18.932547 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.932473 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj" event={"ID":"e74c5ff9-907b-48d2-87b9-f10f173f6bf4","Type":"ContainerStarted","Data":"0b021f35f419615c1c17f4a761b081f160978ca6a6e343c795a711d4a6da9d67"}
Apr 22 16:28:18.932547 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.932509 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj" event={"ID":"e74c5ff9-907b-48d2-87b9-f10f173f6bf4","Type":"ContainerStarted","Data":"4486bd5f5ac08b9032185390373adc0049594753e3c39142990a3665366c9094"}
Apr 22 16:28:18.932743 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.932600 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:18.950534 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:18.950494 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj" podStartSLOduration=0.950482327 podStartE2EDuration="950.482327ms" podCreationTimestamp="2026-04-22 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:28:18.948312837 +0000 UTC m=+3268.417863636" watchObservedRunningTime="2026-04-22 16:28:18.950482327 +0000 UTC m=+3268.420033126"
Apr 22 16:28:19.163387 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:19.163355 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-swhzp_74e2d982-4c5d-4d80-8ce4-b86491c0f765/dns/0.log"
Apr 22 16:28:19.184492 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:19.184414 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-swhzp_74e2d982-4c5d-4d80-8ce4-b86491c0f765/kube-rbac-proxy/0.log"
Apr 22 16:28:19.230647 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:19.230617 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-vqw4p_45497d38-5523-4037-9d5e-b2d5cf55efc2/dns-node-resolver/0.log"
Apr 22 16:28:19.668714 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:19.668672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-f98788744-gfqth_3ebe0392-9655-4115-b039-44138ef50068/registry/0.log"
Apr 22 16:28:19.733583 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:19.733536 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wkqg5_0acda737-f5c9-4897-bd5e-94296fc02284/node-ca/0.log"
Apr 22 16:28:20.711510 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:20.711474 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8njnj_a1de4878-d335-41a9-a163-5332f8e575d6/serve-healthcheck-canary/0.log"
Apr 22 16:28:21.100347 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:21.100316 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87d75_c7193304-0a5d-46fd-89e7-9c0b2192fa40/kube-rbac-proxy/0.log"
Apr 22 16:28:21.120125 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:21.120088 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87d75_c7193304-0a5d-46fd-89e7-9c0b2192fa40/exporter/0.log"
Apr 22 16:28:21.141135 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:21.141104 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87d75_c7193304-0a5d-46fd-89e7-9c0b2192fa40/extractor/0.log"
Apr 22 16:28:22.816997 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:22.816965 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-7889f9478d-hqh9c_1d2087aa-27b3-4294-ba43-d48f6443d0e6/manager/0.log"
Apr 22 16:28:22.841355 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:22.841317 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-mn45p_48b3cb8e-6a8f-4fb1-96ba-8f6b2fea700d/jobset-operator/0.log"
Apr 22 16:28:24.947004 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:24.946968 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-wkxjj"
Apr 22 16:28:27.238307 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.238273 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/kube-multus-additional-cni-plugins/0.log"
Apr 22 16:28:27.258416 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.258389 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/egress-router-binary-copy/0.log"
Apr 22 16:28:27.279420 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.279365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/cni-plugins/0.log"
Apr 22 16:28:27.301050 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.301026 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/bond-cni-plugin/0.log"
Apr 22 16:28:27.321510 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.321487 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/routeoverride-cni/0.log"
Apr 22 16:28:27.343822 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.343801 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/whereabouts-cni-bincopy/0.log"
Apr 22 16:28:27.364388 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.364362 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x6qrc_2b6e7ecf-d54b-4238-8dd4-a8502eb2627e/whereabouts-cni/0.log"
Apr 22 16:28:27.583431 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.583324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gxvvw_8644527b-1d6e-4618-95f6-57427939a8e7/kube-multus/0.log"
Apr 22 16:28:27.733457 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.733421 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zzcmv_f7958036-9067-47bb-91dc-dc565feb289a/network-metrics-daemon/0.log"
Apr 22 16:28:27.754519 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:27.754478 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zzcmv_f7958036-9067-47bb-91dc-dc565feb289a/kube-rbac-proxy/0.log"
Apr 22 16:28:28.495879 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.495847 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/ovn-controller/0.log"
Apr 22 16:28:28.528344 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.528304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/ovn-acl-logging/0.log"
Apr 22 16:28:28.546636 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.546602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/kube-rbac-proxy-node/0.log"
Apr 22 16:28:28.566841 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.566761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 16:28:28.583295 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.583259 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/northd/0.log"
Apr 22 16:28:28.602629 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.602601 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/nbdb/0.log"
Apr 22 16:28:28.622271 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.622236 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/sbdb/0.log"
Apr 22 16:28:28.714128 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:28.714093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4qpdw_bf7bc51e-13ec-42f9-912c-b51cd7134006/ovnkube-controller/0.log"
Apr 22 16:28:30.468575 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:30.468543 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zhpd9_f537b0eb-8087-4572-b237-83ff59e51f13/network-check-target-container/0.log"
Apr 22 16:28:31.339153 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:31.339113 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wd2xg_01fd759b-aafb-49c6-a60e-5424150b1157/iptables-alerter/0.log"
Apr 22 16:28:31.933846 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:31.933815 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-vmhqd_168184b4-2b79-4b84-9cff-2a4fe584c2ab/tuned/0.log"
Apr 22 16:28:34.453957 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:34.453869 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-lhbsj_31ebe0d7-a716-4a01-85dd-f78472a7c41b/service-ca-operator/1.log"
Apr 22 16:28:34.454909 ip-10-0-143-30 kubenswrapper[2577]: I0422 16:28:34.454879 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-lhbsj_31ebe0d7-a716-4a01-85dd-f78472a7c41b/service-ca-operator/0.log"