Apr 16 19:53:58.398169 ip-10-0-130-164 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:53:58.897486 ip-10-0-130-164 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:58.897486 ip-10-0-130-164 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:53:58.897486 ip-10-0-130-164 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:53:58.897486 ip-10-0-130-164 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:53:58.897486 ip-10-0-130-164 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
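The deprecation warnings above all point at the kubelet config file. As a rough sketch, the equivalent settings in a KubeletConfiguration look like the fragment below (field names are from the kubelet.config.k8s.io/v1beta1 API; the values are illustrative, not taken from this node, and on an OpenShift node /etc/kubernetes/kubelet.conf is managed by the Machine Config Operator rather than edited by hand):

```yaml
# Illustrative KubeletConfiguration fragment; values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: "200Mi"
  nodefs.available: "10%"
```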
Apr 16 19:53:58.900109 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.900017 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:53:58.904858 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904843 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:58.904858 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904858 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904862 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904866 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904869 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904873 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904876 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904879 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904882 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904885 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904887 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904890 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904893 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904896 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904899 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904902 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904905 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904908 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904911 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904913 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904916 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:58.904921 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904918 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904922 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904924 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904927 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904930 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904933 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904936 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904938 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904941 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904943 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904946 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904949 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904952 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904955 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904958 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904960 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904963 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904965 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904968 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904970 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:58.905397 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904973 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904976 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904979 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904982 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904984 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904987 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904989 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904992 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904994 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904997 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.904999 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905002 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905004 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905008 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905011 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905014 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905017 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905019 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905022 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905024 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:58.905908 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905027 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905030 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905032 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905035 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905037 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905040 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905043 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905047 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905051 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905055 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905058 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905061 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905063 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905066 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905071 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905075 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905078 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905081 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905083 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:58.906390 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905086 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905089 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905091 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905094 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905096 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905099 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905488 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905494 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905497 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905501 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905503 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905506 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905509 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905511 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905514 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905516 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905519 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905521 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905524 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:53:58.906863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905526 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905529 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905533 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905537 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905540 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905543 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905545 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905548 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905550 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905553 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905555 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905558 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905560 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905563 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905566 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905568 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905573 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905576 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905579 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:53:58.907447 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905590 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905595 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905598 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905601 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905604 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905607 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905609 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905612 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905614 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905617 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905620 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905622 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905625 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905627 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905630 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905633 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905636 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905638 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905641 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:53:58.907930 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905643 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905646 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905649 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905651 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905653 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905656 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905659 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905662 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905665 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905668 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905670 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905673 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905676 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905678 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905681 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905683 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905686 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905688 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905691 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905694 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:53:58.908400 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905697 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905699 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905701 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905704 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905706 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905709 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905711 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905714 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905716 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905719 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905722 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905724 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905727 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905730 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.905732 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907223 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907233 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907240 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907245 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907250 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907254 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:53:58.908901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907258 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907263 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907267 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907270 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907274 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907277 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907280 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907283 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907286 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907290 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907293 2561 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907295 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907298 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907303 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907306 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907310 2561 flags.go:64] FLAG: --config-dir=""
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907313 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907317 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907333 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907336 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907340 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907343 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907347 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907350 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:53:58.909421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907353 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907357 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907360 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907364 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907367 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907374 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907377 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907380 2561 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907383 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907388 2561 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907391 2561 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907394 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907398 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907401 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907405 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907408 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907411 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907414 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907417 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907421 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907424 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907427 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907430 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907433 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907437 2561 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:53:58.910029 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907441 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907444 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907448 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907451 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907455 2561 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907459 2561 flags.go:64] FLAG: --help="false"
Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907462 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-130-164.ec2.internal"
Apr 16 19:53:58.910627
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907465 2561 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907468 2561 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907471 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907475 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907478 2561 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907482 2561 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907485 2561 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907487 2561 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907490 2561 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907493 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907496 2561 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907499 2561 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907502 2561 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907505 2561 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907508 2561 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907511 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907514 2561 flags.go:64] FLAG: --lock-file="" Apr 16 19:53:58.910627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907517 2561 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907520 2561 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907523 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907528 2561 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907531 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907534 2561 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907537 2561 flags.go:64] FLAG: --logging-format="text" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907541 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907545 2561 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907547 2561 flags.go:64] FLAG: --manifest-url="" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907550 2561 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:53:58.911214 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:53:58.907555 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907558 2561 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907563 2561 flags.go:64] FLAG: --max-pods="110" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907570 2561 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907573 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907576 2561 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907579 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907582 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907585 2561 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907588 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907595 2561 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907599 2561 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907602 2561 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:53:58.911214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907605 2561 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907608 2561 
flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907613 2561 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907616 2561 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907620 2561 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907623 2561 flags.go:64] FLAG: --port="10250" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907626 2561 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907630 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05d5fd082d037b412" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907633 2561 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907636 2561 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907639 2561 flags.go:64] FLAG: --register-node="true" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907642 2561 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907645 2561 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907649 2561 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907652 2561 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907655 2561 flags.go:64] FLAG: --reserved-cpus="" 
Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907658 2561 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907664 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907668 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907670 2561 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907674 2561 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907677 2561 flags.go:64] FLAG: --runonce="false" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907680 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907683 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907686 2561 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:53:58.911805 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907689 2561 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907692 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907695 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907698 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907701 2561 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:53:58.907704 2561 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907707 2561 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907710 2561 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907713 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907716 2561 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907719 2561 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907722 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907727 2561 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907730 2561 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907733 2561 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907737 2561 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907740 2561 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907743 2561 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907746 2561 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907749 2561 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907752 2561 flags.go:64] FLAG: --v="2" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907756 2561 flags.go:64] FLAG: --version="false" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907760 2561 flags.go:64] FLAG: --vmodule="" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907765 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:53:58.912466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.907768 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907894 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907900 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907904 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907907 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907910 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907914 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907917 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907920 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:58.913049 ip-10-0-130-164 
kubenswrapper[2561]: W0416 19:53:58.907923 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907926 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907929 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907932 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907935 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907937 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907940 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907943 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907945 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907948 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907951 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:58.913049 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907953 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907955 2561 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907958 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907960 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907963 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907965 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907968 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907971 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907973 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907976 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907978 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907981 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907983 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907986 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907988 2561 
feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907991 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907993 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.907996 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908000 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908003 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:58.913550 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908006 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908008 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908011 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908014 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908017 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908019 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908022 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:58.914279 
ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908024 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908026 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908029 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908032 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908034 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908036 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908039 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908041 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908044 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908046 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908049 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908052 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908054 2561 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 16 19:53:58.914279 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908057 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908060 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908063 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908066 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908068 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908071 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908075 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908078 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908081 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908084 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908088 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908091 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908093 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908096 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908099 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908102 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908105 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908107 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908110 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:58.915094 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908113 2561 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908115 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908118 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908120 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908123 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908126 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908128 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.908131 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:58.915625 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.908136 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.915628 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 19:53:58.915863 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:53:58.915645 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915695 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915700 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915704 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915708 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915711 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915714 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915717 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915720 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915722 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915725 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915728 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915730 2561 feature_gate.go:328] unrecognized feature 
gate: ImageModeStatusReporting Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915733 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915736 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915738 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915741 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:58.915863 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915745 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915749 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915752 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915755 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915757 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915760 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915763 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915765 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 
19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915768 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915770 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915773 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915776 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915779 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915781 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915784 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915787 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915790 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915793 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915795 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915798 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:58.916368 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915801 2561 feature_gate.go:328] 
unrecognized feature gate: ShortCertRotation Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915803 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915806 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915808 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915811 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915814 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915816 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915819 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915822 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915842 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915846 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915849 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915852 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:58.916883 
ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915856 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915861 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915864 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915867 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915870 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915874 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:58.916883 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915877 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915880 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915883 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915886 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915889 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915891 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915894 2561 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915897 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915900 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915903 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915906 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915908 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915911 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915914 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915916 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915919 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915921 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915924 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915927 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:58.917353 
ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915929 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:58.917353 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915932 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915935 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915938 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915940 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915943 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915946 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915948 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915951 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915954 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915956 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.915959 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.915964 2561 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916058 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916063 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916066 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916069 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:53:58.917871 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916072 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916075 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916078 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916081 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916084 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916088 2561 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916091 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916094 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916096 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916099 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916102 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916104 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916107 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916109 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916112 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916115 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916117 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916121 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916126 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916129 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:53:58.918270 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916132 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916135 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916137 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916141 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916144 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916146 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916149 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916152 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916154 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916157 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916160 2561 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916162 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916165 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916168 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916170 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916173 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916175 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916178 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916181 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916184 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:53:58.918751 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916186 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916189 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916191 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:53:58.919264 ip-10-0-130-164 
kubenswrapper[2561]: W0416 19:53:58.916194 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916196 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916199 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916201 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916204 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916206 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916209 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916211 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916215 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916217 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916220 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916223 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916225 2561 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916228 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916230 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916233 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:53:58.919264 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916236 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916238 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916241 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916243 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916245 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916248 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916250 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916253 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916256 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:53:58.919734 
ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916259 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916263 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916266 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916268 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916271 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916274 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916276 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916279 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916281 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916283 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:53:58.919734 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916286 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:53:58.920208 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916289 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:53:58.920208 ip-10-0-130-164 kubenswrapper[2561]: 
W0416 19:53:58.916291 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:53:58.920208 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:53:58.916294 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:53:58.920208 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.916299 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 19:53:58.920208 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.916928 2561 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 19:53:58.920358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.920344 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 19:53:58.921266 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.921255 2561 server.go:1019] "Starting client certificate rotation" Apr 16 19:53:58.921367 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.921353 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:58.921412 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.921390 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 19:53:58.943960 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.943940 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:58.946502 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:53:58.946485 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 19:53:58.966581 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.966559 2561 log.go:25] "Validated CRI v1 runtime API" Apr 16 19:53:58.975006 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.974974 2561 log.go:25] "Validated CRI v1 image API" Apr 16 19:53:58.976398 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.976377 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 19:53:58.977719 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.977701 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:53:58.979444 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.979422 2561 fs.go:135] Filesystem UUIDs: map[0081c59e-6085-4d06-99d0-e29a303ece4b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e056ecf7-bd7a-4484-9908-7dd67162730d:/dev/nvme0n1p4] Apr 16 19:53:58.979517 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.979443 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 19:53:58.985879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.985759 2561 manager.go:217] Machine: {Timestamp:2026-04-16 19:53:58.983164568 +0000 UTC m=+0.466313481 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3109718 MemoryCapacity:33164484608 SwapCapacity:0 
MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec293a1eece60a1cceb27219bb1a48d0 SystemUUID:ec293a1e-ece6-0a1c-ceb2-7219bb1a48d0 BootID:68610336-4c01-4969-9304-5bdf11269142 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:fd:ec:7c:7d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:fd:ec:7c:7d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:95:47:e1:0b:6b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 19:53:58.985879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.985875 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 19:53:58.986002 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.985990 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:53:58.989189 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.989159 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:53:58.989326 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.989190 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-130-164.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:53:58.989370 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.989336 2561 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:53:58.989370 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.989343 2561 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:53:58.989370 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.989356
2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:58.989452 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.989375 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:53:58.990524 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.990513 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:58.990634 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.990625 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:53:58.993836 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.993819 2561 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 19:53:58.993874 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.993843 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 19:53:58.993874 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.993856 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 19:53:58.993874 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.993865 2561 kubelet.go:397] "Adding apiserver pod source"
Apr 16 19:53:58.993973 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.993893 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 19:53:58.996446 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.996434 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:53:58.996490 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.996453 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 19:53:58.999729 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:58.999714 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 19:53:59.001294 ip-10-0-130-164
kubenswrapper[2561]: I0416 19:53:59.001274 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kpsgc"
Apr 16 19:53:59.001680 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.001667 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 19:53:59.002816 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002802 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 19:53:59.002875 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002841 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 19:53:59.002875 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002855 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 19:53:59.002875 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002866 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 19:53:59.002963 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002909 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 19:53:59.002963 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002922 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 19:53:59.002963 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002936 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 19:53:59.002963 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002948 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 19:53:59.002963 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002962 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 19:53:59.003090 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.002975 2561 plugins.go:616]
"Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 19:53:59.003090 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.003001 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 19:53:59.003090 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.003018 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 19:53:59.003775 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.003764 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 19:53:59.003809 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.003781 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 19:53:59.006677 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.006655 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-164.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 19:53:59.006677 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.006661 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 19:53:59.007917 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.007904 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 19:53:59.007977 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.007944 2561 server.go:1295] "Started kubelet"
Apr 16 19:53:59.008080 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.008043 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 19:53:59.008145 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.008092 2561 ratelimit.go:55]
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 19:53:59.008182 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.008166 2561 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 19:53:59.008744 ip-10-0-130-164 systemd[1]: Started Kubernetes Kubelet.
Apr 16 19:53:59.009882 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.009854 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 19:53:59.010300 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.010278 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kpsgc"
Apr 16 19:53:59.010690 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.010678 2561 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 19:53:59.016092 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.016073 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 19:53:59.016810 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.016792 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 19:53:59.019338 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.019085 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.019604 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.019588 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 19:53:59.020489 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.019851 2561 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 19:53:59.020489 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.019869 2561 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 19:53:59.020489 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.020055 2561 reconstruct.go:97] "Volume
reconstruction finished"
Apr 16 19:53:59.020489 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.020064 2561 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 19:53:59.021338 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.021137 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:53:59.023883 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.023104 2561 factory.go:55] Registering systemd factory
Apr 16 19:53:59.024062 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.024050 2561 factory.go:223] Registration of the systemd container factory successfully
Apr 16 19:53:59.024222 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.023977 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:59.024453 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.024440 2561 factory.go:153] Registering CRI-O factory
Apr 16 19:53:59.024538 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.024457 2561 factory.go:223] Registration of the crio container factory successfully
Apr 16 19:53:59.024576 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.024554 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:53:59.024611 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.024588 2561 factory.go:103] Registering Raw factory
Apr 16 19:53:59.024649 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.024614 2561 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:53:59.025268 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.025252 2561 manager.go:319] Starting recovery of all containers
Apr 16 19:53:59.029101 ip-10-0-130-164
kubenswrapper[2561]: I0416 19:53:59.029080 2561 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-164.ec2.internal" not found
Apr 16 19:53:59.029197 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.029106 2561 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-164.ec2.internal\" not found" node="ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.035754 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.035734 2561 manager.go:324] Recovery completed
Apr 16 19:53:59.039659 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.039647 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:59.042240 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.042224 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:59.042304 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.042251 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:59.042304 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.042262 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:59.042748 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.042732 2561 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:53:59.042748 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.042747 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:53:59.042847 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.042786 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:53:59.045173 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.045160 2561 policy_none.go:49] "None policy: Start"
Apr 16 19:53:59.045227 ip-10-0-130-164 kubenswrapper[2561]: I0416
19:53:59.045176 2561 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 19:53:59.045227 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.045187 2561 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 19:53:59.045924 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.045912 2561 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-164.ec2.internal" not found
Apr 16 19:53:59.083420 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.083402 2561 manager.go:341] "Starting Device Plugin manager"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.083440 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.083455 2561 server.go:85] "Starting device plugin registration server"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.083768 2561 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.083777 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.083888 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.083990 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.084000 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.084420 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring."
err="non-existent label \"crio-containers\""
Apr 16 19:53:59.089107 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.084460 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.105453 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.105426 2561 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-164.ec2.internal" not found
Apr 16 19:53:59.148681 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.148611 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 19:53:59.149766 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.149751 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 19:53:59.149860 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.149776 2561 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 19:53:59.149860 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.149794 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 19:53:59.149860 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.149800 2561 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 19:53:59.149860 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.149845 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 19:53:59.153448 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.153419 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:59.184823 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.184800 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:59.185649 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.185633 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:59.185721 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.185664 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:59.185721 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.185676 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:59.185721 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.185700 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.195448 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.195432 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.195500 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.195452 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-164.ec2.internal\": node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16
19:53:59.208757 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.208736 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.250569 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.250534 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal"]
Apr 16 19:53:59.250672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.250622 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:59.251518 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.251503 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:59.251594 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.251529 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:59.251594 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.251543 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:59.252979 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.252967 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:59.253126 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253113 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.253162 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253142 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:59.253700 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253684 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:59.253700 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253694 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:59.253784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253707 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:59.253784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253713 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:59.253784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253717 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:59.253784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.253723 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:59.255023 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.255010 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.255069 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.255035 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:53:59.255665 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.255649 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:53:59.255765 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.255673 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:53:59.255765 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.255686 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:53:59.278420 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.278397 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-164.ec2.internal\" not found" node="ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.282808 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.282791 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-164.ec2.internal\" not found" node="ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.309637 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.309617 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.321980 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.321959 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc485f6a70ac1e219db7780dcfe225f6-var-lib-kubelet\") pod
\"kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal\" (UID: \"fc485f6a70ac1e219db7780dcfe225f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.322045 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.321985 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3a0a5c69e63ac560ca3e107bf62451fd-config\") pod \"kube-apiserver-proxy-ip-10-0-130-164.ec2.internal\" (UID: \"3a0a5c69e63ac560ca3e107bf62451fd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.322045 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.322004 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc485f6a70ac1e219db7780dcfe225f6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal\" (UID: \"fc485f6a70ac1e219db7780dcfe225f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.410332 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.410257 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.423162 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.423144 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc485f6a70ac1e219db7780dcfe225f6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal\" (UID: \"fc485f6a70ac1e219db7780dcfe225f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.423231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.423170 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/fc485f6a70ac1e219db7780dcfe225f6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal\" (UID: \"fc485f6a70ac1e219db7780dcfe225f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.423231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.423188 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3a0a5c69e63ac560ca3e107bf62451fd-config\") pod \"kube-apiserver-proxy-ip-10-0-130-164.ec2.internal\" (UID: \"3a0a5c69e63ac560ca3e107bf62451fd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.423231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.423226 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3a0a5c69e63ac560ca3e107bf62451fd-config\") pod \"kube-apiserver-proxy-ip-10-0-130-164.ec2.internal\" (UID: \"3a0a5c69e63ac560ca3e107bf62451fd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.423337 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.423241 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/fc485f6a70ac1e219db7780dcfe225f6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal\" (UID: \"fc485f6a70ac1e219db7780dcfe225f6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.423337 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.423255 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc485f6a70ac1e219db7780dcfe225f6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal\" (UID: \"fc485f6a70ac1e219db7780dcfe225f6\") "
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.510382 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.510352 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.580925 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.580894 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.585504 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.585488 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal"
Apr 16 19:53:59.611241 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.611218 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.711774 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.711692 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.812179 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.812150 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.853485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.853458 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:53:59.912314 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:53:59.912272 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:53:59.921508 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.921493 2561 transport.go:147] "Certificate rotation detected, shutting down client
connections to start using new credentials"
Apr 16 19:53:59.921635 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.921618 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:59.921672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.921655 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:53:59.921704 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:53:59.921655 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:54:00.012317 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.012272 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:48:59 +0000 UTC" deadline="2027-11-19 22:00:24.254562556 +0000 UTC"
Apr 16 19:54:00.012317 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.012311 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13970h6m24.242254173s"
Apr 16 19:54:00.012526 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:00.012343 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found"
Apr 16 19:54:00.016492 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.016469 2561 certificate_manager.go:566]
"Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:54:00.031628 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.031606 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:54:00.058276 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:00.058248 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a0a5c69e63ac560ca3e107bf62451fd.slice/crio-19d93d250a8ff3a4b1d2698f0021967fe0029580019eafe366206e158d9d0cd0 WatchSource:0}: Error finding container 19d93d250a8ff3a4b1d2698f0021967fe0029580019eafe366206e158d9d0cd0: Status 404 returned error can't find the container with id 19d93d250a8ff3a4b1d2698f0021967fe0029580019eafe366206e158d9d0cd0 Apr 16 19:54:00.059028 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:00.059006 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc485f6a70ac1e219db7780dcfe225f6.slice/crio-0c18cad6dea2667bdfd6d4390884b853718908fb2c82dfaabf7e8a82200b9ab1 WatchSource:0}: Error finding container 0c18cad6dea2667bdfd6d4390884b853718908fb2c82dfaabf7e8a82200b9ab1: Status 404 returned error can't find the container with id 0c18cad6dea2667bdfd6d4390884b853718908fb2c82dfaabf7e8a82200b9ab1 Apr 16 19:54:00.059377 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.059362 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vbfbg" Apr 16 19:54:00.062635 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.062617 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:54:00.065235 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.065219 2561 csr.go:270] "Certificate signing request is issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-vbfbg" Apr 16 19:54:00.112644 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:00.112608 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found" Apr 16 19:54:00.152937 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.152889 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal" event={"ID":"3a0a5c69e63ac560ca3e107bf62451fd","Type":"ContainerStarted","Data":"19d93d250a8ff3a4b1d2698f0021967fe0029580019eafe366206e158d9d0cd0"} Apr 16 19:54:00.153823 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.153792 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal" event={"ID":"fc485f6a70ac1e219db7780dcfe225f6","Type":"ContainerStarted","Data":"0c18cad6dea2667bdfd6d4390884b853718908fb2c82dfaabf7e8a82200b9ab1"} Apr 16 19:54:00.213042 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:00.212980 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found" Apr 16 19:54:00.313512 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:00.313488 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found" Apr 16 19:54:00.414092 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:00.414057 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-164.ec2.internal\" not found" Apr 16 19:54:00.450178 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.450141 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:54:00.520453 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.520354 2561 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal" Apr 16 19:54:00.531320 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.531217 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:54:00.532148 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.532126 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal" Apr 16 19:54:00.538076 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.538047 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:54:00.945635 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.945552 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:54:00.964717 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.964690 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:54:00.995939 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:00.995908 2561 apiserver.go:52] "Watching apiserver" Apr 16 19:54:01.001323 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.001300 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:54:01.002329 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.002297 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-jkgr6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal","openshift-multus/multus-6j2q7","openshift-network-operator/iptables-alerter-ck9lp","openshift-ovn-kubernetes/ovnkube-node-t5dmh","kube-system/konnectivity-agent-h2cnc","kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal","openshift-cluster-node-tuning-operator/tuned-t9q6h","openshift-image-registry/node-ca-xbhwz","openshift-multus/multus-additional-cni-plugins-c96xs","openshift-multus/network-metrics-daemon-hkqtk"] Apr 16 19:54:01.004247 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.004228 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:01.004343 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.004310 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131" Apr 16 19:54:01.005316 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.005297 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.006425 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.006401 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.007463 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.007447 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.008443 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.008416 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:54:01.008533 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.008414 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:54:01.008533 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.008423 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pzhfx\"" Apr 16 19:54:01.008533 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.008524 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:54:01.008924 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.008907 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.010072 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.010053 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.011251 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.011235 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.012559 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.012503 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.015628 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.015365 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:54:01.015628 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.015535 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:01.015767 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.015738 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8sc85\"" Apr 16 19:54:01.015821 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.015763 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:54:01.015821 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.015788 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:54:01.016057 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016035 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-thpgj\"" Apr 16 19:54:01.016236 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016207 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:54:01.016420 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016404 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q6cfv\"" Apr 16 19:54:01.016482 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016433 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" 
Apr 16 19:54:01.016716 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016698 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:54:01.016781 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016770 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:54:01.016846 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016796 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:54:01.016981 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.016962 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:54:01.017068 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.017052 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:54:01.017593 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.017574 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.017919 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.017903 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:54:01.018001 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.017935 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hdjfc\"" Apr 16 19:54:01.018001 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.017995 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:01.018153 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018132 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:54:01.018190 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018160 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:54:01.018238 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018199 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:54:01.018320 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018306 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:54:01.018320 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018313 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:54:01.018421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018357 2561 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-x25sx\"" Apr 16 19:54:01.018475 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018444 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:54:01.018475 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018464 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:54:01.018675 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.018657 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-94xj9\"" Apr 16 19:54:01.019150 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.019134 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.019223 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.019187 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e" Apr 16 19:54:01.020961 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.020945 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:54:01.025492 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.025471 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:54:01.025627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.025610 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:54:01.025842 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.025812 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m46wp\"" Apr 16 19:54:01.030754 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030735 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-tuned\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.030872 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030767 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-systemd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.030872 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030793 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.030984 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030817 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8890691e-8929-4051-8642-9a4556f51961-konnectivity-ca\") pod \"konnectivity-agent-h2cnc\" (UID: \"8890691e-8929-4051-8642-9a4556f51961\") " pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.030984 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030905 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-socket-dir-parent\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.030984 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030931 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-hostroot\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.030984 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.030954 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-etc-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031141 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031003 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-run-ovn-kubernetes\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031141 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031035 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-cni-netd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031141 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031056 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-kubelet\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.031141 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031071 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-host-slash\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.031141 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031087 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/012be36b-5039-4ab8-82de-702be926779a-host\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " 
pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.031141 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031111 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-device-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031149 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltlvg\" (UniqueName: \"kubernetes.io/projected/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-kube-api-access-ltlvg\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031165 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-multus-certs\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031187 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-var-lib-kubelet\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031222 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-ovn\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031270 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-k8s-cni-cncf-io\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031299 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-env-overrides\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031339 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovn-node-metrics-cert\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031404 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031379 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-cni-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031406 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-netns\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031430 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-lib-modules\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031455 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-system-cni-dir\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031478 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031515 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysconfig\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031542 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031574 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031610 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-registration-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031643 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-slash\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031669 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-log-socket\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.031711 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031695 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031720 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-cni-bin\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031765 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-system-cni-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031810 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7ghk\" (UniqueName: \"kubernetes.io/projected/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-kube-api-access-c7ghk\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.032161 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:54:01.031848 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-run\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031881 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/012be36b-5039-4ab8-82de-702be926779a-serviceca\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031902 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-sys-fs\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031925 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-var-lib-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031962 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cnibin\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.031997 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-cnibin\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032017 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-modprobe-d\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032045 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58l5g\" (UniqueName: \"kubernetes.io/projected/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-kube-api-access-58l5g\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032062 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-socket-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4"
Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032085 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName:
\"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-systemd-units\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032120 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-cni-bin\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.032161 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032149 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnhqd\" (UniqueName: \"kubernetes.io/projected/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-kube-api-access-xnhqd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032181 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-iptables-alerter-script\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032204 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-conf-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032226 
2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-daemon-config\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032247 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-etc-kubernetes\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032273 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-cni-multus\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032306 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysctl-conf\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032330 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-os-release\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " 
pod="openshift-multus/multus-additional-cni-plugins-c96xs"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032355 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87e620d6-c02b-44e6-897b-45e0488dc88a-cni-binary-copy\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032389 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-tmp\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032426 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-kubelet\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032451 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-node-log\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032476 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8890691e-8929-4051-8642-9a4556f51961-agent-certs\") pod \"konnectivity-agent-h2cnc\" (UID: \"8890691e-8929-4051-8642-9a4556f51961\") " pod="kube-system/konnectivity-agent-h2cnc"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032512 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysctl-d\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032542 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-sys\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032564 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-host\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032602 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfzl\" (UniqueName: \"kubernetes.io/projected/012be36b-5039-4ab8-82de-702be926779a-kube-api-access-5rfzl\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz"
Apr 16 19:54:01.032752 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032627 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032667 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-kubernetes\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032695 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w748\" (UniqueName: \"kubernetes.io/projected/24c64247-d926-464e-86ae-324af71b98d9-kube-api-access-6w748\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032733 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-run-netns\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032785 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.033441 
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032820 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032861 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqsr\" (UniqueName: \"kubernetes.io/projected/87e620d6-c02b-44e6-897b-45e0488dc88a-kube-api-access-jdqsr\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032887 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032912 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovnkube-script-lib\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032936 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cni-binary-copy\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.032979 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwdpn\" (UniqueName: \"kubernetes.io/projected/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-kube-api-access-fwdpn\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.033024 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovnkube-config\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.033048 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-os-release\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.033441 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.033070 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-systemd\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.066502 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.066471 
2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:49:00 +0000 UTC" deadline="2027-12-15 05:30:44.520596946 +0000 UTC" Apr 16 19:54:01.066502 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.066499 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14577h36m43.454101419s" Apr 16 19:54:01.113977 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.113947 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dghfv"] Apr 16 19:54:01.116269 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.116250 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.116394 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.116329 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:01.133476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133422 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58l5g\" (UniqueName: \"kubernetes.io/projected/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-kube-api-access-58l5g\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h"
Apr 16 19:54:01.133645 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133549 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-socket-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4"
Apr 16 19:54:01.133645 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133578 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-systemd-units\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.133645 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133601 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-cni-bin\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.133645 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133623 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnhqd\" (UniqueName: \"kubernetes.io/projected/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-kube-api-access-xnhqd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133652 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65540f57-d96d-4dcd-b7a1-010ff5e40cae-dbus\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133693 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-systemd-units\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133711 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-iptables-alerter-script\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp"
Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133751 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-conf-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7"
Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133754 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-socket-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133775 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-daemon-config\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.133856 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133811 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-etc-kubernetes\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133876 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133885 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-conf-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133950 2561 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-etc-kubernetes\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.133989 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-cni-multus\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134010 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-cni-bin\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134055 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysctl-conf\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134065 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-cni-multus\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134207 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134116 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-os-release\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.134540 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134506 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-os-release\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.134640 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134603 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysctl-conf\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.134869 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134742 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87e620d6-c02b-44e6-897b-45e0488dc88a-cni-binary-copy\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134869 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134784 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-daemon-config\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.134869 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134807 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-tmp\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.135044 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134872 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-kubelet\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.135044 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.134972 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-kubelet\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135173 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-node-log\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135217 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8890691e-8929-4051-8642-9a4556f51961-agent-certs\") pod \"konnectivity-agent-h2cnc\" (UID: \"8890691e-8929-4051-8642-9a4556f51961\") " pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135250 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysctl-d\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135291 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-sys\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135328 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-host\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135364 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87e620d6-c02b-44e6-897b-45e0488dc88a-cni-binary-copy\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135385 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfzl\" (UniqueName: \"kubernetes.io/projected/012be36b-5039-4ab8-82de-702be926779a-kube-api-access-5rfzl\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135449 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135477 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135516 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135555 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-iptables-alerter-script\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135610 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysctl-d\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135637 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-node-log\") pod \"ovnkube-node-t5dmh\" (UID: 
\"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.135672 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135670 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-kubernetes\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135704 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w748\" (UniqueName: \"kubernetes.io/projected/24c64247-d926-464e-86ae-324af71b98d9-kube-api-access-6w748\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135709 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-host\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135738 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-run-netns\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135776 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-tuning-conf-dir\") 
pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135794 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-kubernetes\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135811 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135863 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqsr\" (UniqueName: \"kubernetes.io/projected/87e620d6-c02b-44e6-897b-45e0488dc88a-kube-api-access-jdqsr\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135916 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135949 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovnkube-script-lib\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135978 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cni-binary-copy\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136008 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65540f57-d96d-4dcd-b7a1-010ff5e40cae-kubelet-config\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136045 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwdpn\" (UniqueName: \"kubernetes.io/projected/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-kube-api-access-fwdpn\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136073 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-sys\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136080 
2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovnkube-config\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136113 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-os-release\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136145 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-systemd\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.136391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136174 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-tuned\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136198 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-systemd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136221 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136230 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136262 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8890691e-8929-4051-8642-9a4556f51961-konnectivity-ca\") pod \"konnectivity-agent-h2cnc\" (UID: \"8890691e-8929-4051-8642-9a4556f51961\") " pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136296 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-socket-dir-parent\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136326 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-hostroot\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:54:01.136352 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-etc-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136382 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-run-ovn-kubernetes\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136412 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-cni-netd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136441 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-kubelet\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136471 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-host-slash\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.137225 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:54:01.136497 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/012be36b-5039-4ab8-82de-702be926779a-host\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136526 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-device-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136557 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltlvg\" (UniqueName: \"kubernetes.io/projected/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-kube-api-access-ltlvg\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136589 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-multus-certs\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136618 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-var-lib-kubelet\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.137225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136649 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-ovn\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136678 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-k8s-cni-cncf-io\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136701 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-env-overrides\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136731 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovn-node-metrics-cert\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136768 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-cni-dir\") pod \"multus-6j2q7\" (UID: 
\"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.135865 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-run-netns\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136800 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-netns\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136872 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-lib-modules\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136880 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-hostroot\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136918 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-system-cni-dir\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " 
pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136952 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136977 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysconfig\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.136986 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-lib-modules\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137045 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-sysconfig\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137079 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: 
\"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137113 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137152 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-registration-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.138073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137183 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-slash\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137213 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-log-socket\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137280 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-cni-bin\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137339 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-system-cni-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137369 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7ghk\" (UniqueName: \"kubernetes.io/projected/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-kube-api-access-c7ghk\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137399 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-run\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137432 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/012be36b-5039-4ab8-82de-702be926779a-serviceca\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-sys-fs\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137497 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-var-lib-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137528 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cnibin\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137559 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-cnibin\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137588 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-modprobe-d\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137672 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovnkube-script-lib\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137733 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-modprobe-d\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137787 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-etc-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137856 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-run-ovn-kubernetes\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.138855 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137904 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-cni-netd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.137950 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-kubelet\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.138041 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.138127 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.638097961 +0000 UTC m=+3.121246882 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.138184 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cni-binary-copy\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.138331 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.138392 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-registration-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.138439 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-slash\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.139619 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:54:01.138983 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-tmp\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139115 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139189 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-k8s-cni-cncf-io\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139258 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-host-slash\") pod \"iptables-alerter-ck9lp\" (UID: \"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139299 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8890691e-8929-4051-8642-9a4556f51961-agent-certs\") pod \"konnectivity-agent-h2cnc\" (UID: \"8890691e-8929-4051-8642-9a4556f51961\") " pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.139619 
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139303 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/012be36b-5039-4ab8-82de-702be926779a-host\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139354 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-device-dir\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139413 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovnkube-config\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139491 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-systemd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.139619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139615 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-multus-certs\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:54:01.139677 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-var-lib-kubelet\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139726 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-run-ovn\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139741 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-env-overrides\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139796 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-os-release\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139935 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-systemd\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.139969 2561 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140317 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/012be36b-5039-4ab8-82de-702be926779a-serviceca\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140382 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.140471 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140435 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-var-lib-cni-bin\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140477 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-host-run-netns\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140490 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-system-cni-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140558 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-run\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140722 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-cnibin\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140728 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-cni-dir\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140867 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.140941 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140926 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-system-cni-dir\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.141257 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.140803 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87e620d6-c02b-44e6-897b-45e0488dc88a-multus-socket-dir-parent\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.141257 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.141014 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-log-socket\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.141257 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.141098 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/24c64247-d926-464e-86ae-324af71b98d9-sys-fs\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.141257 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.141193 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-var-lib-openvswitch\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.141524 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.141310 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-cnibin\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.142520 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.142497 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-etc-tuned\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.142631 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.142542 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-ovn-node-metrics-cert\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.144035 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.144015 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8890691e-8929-4051-8642-9a4556f51961-konnectivity-ca\") pod \"konnectivity-agent-h2cnc\" (UID: \"8890691e-8929-4051-8642-9a4556f51961\") " pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.153748 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.153701 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w748\" (UniqueName: \"kubernetes.io/projected/24c64247-d926-464e-86ae-324af71b98d9-kube-api-access-6w748\") pod \"aws-ebs-csi-driver-node-ljfx4\" (UID: \"24c64247-d926-464e-86ae-324af71b98d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.153915 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.153893 2561 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-58l5g\" (UniqueName: \"kubernetes.io/projected/5f9ec80f-6c8e-4cdf-989c-4c357a66efe8-kube-api-access-58l5g\") pod \"tuned-t9q6h\" (UID: \"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8\") " pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.155625 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.155604 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfzl\" (UniqueName: \"kubernetes.io/projected/012be36b-5039-4ab8-82de-702be926779a-kube-api-access-5rfzl\") pod \"node-ca-xbhwz\" (UID: \"012be36b-5039-4ab8-82de-702be926779a\") " pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.157896 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.157877 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqsr\" (UniqueName: \"kubernetes.io/projected/87e620d6-c02b-44e6-897b-45e0488dc88a-kube-api-access-jdqsr\") pod \"multus-6j2q7\" (UID: \"87e620d6-c02b-44e6-897b-45e0488dc88a\") " pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.159873 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.159852 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:01.159971 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.159881 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:01.159971 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.159895 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.160079 
ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.159975 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.659954751 +0000 UTC m=+3.143103647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.161578 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.161560 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwdpn\" (UniqueName: \"kubernetes.io/projected/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-kube-api-access-fwdpn\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.162112 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.162089 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltlvg\" (UniqueName: \"kubernetes.io/projected/c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c-kube-api-access-ltlvg\") pod \"multus-additional-cni-plugins-c96xs\" (UID: \"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c\") " pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.162420 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.162379 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7ghk\" (UniqueName: \"kubernetes.io/projected/8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5-kube-api-access-c7ghk\") pod \"iptables-alerter-ck9lp\" (UID: 
\"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5\") " pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.162873 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.162854 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnhqd\" (UniqueName: \"kubernetes.io/projected/c5e7c7d5-574f-4907-8f05-2b58d5c7118f-kube-api-access-xnhqd\") pod \"ovnkube-node-t5dmh\" (UID: \"c5e7c7d5-574f-4907-8f05-2b58d5c7118f\") " pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.238616 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.238530 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65540f57-d96d-4dcd-b7a1-010ff5e40cae-dbus\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.238616 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.238575 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.238616 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.238613 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/65540f57-d96d-4dcd-b7a1-010ff5e40cae-kubelet-config\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.238859 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.238693 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/65540f57-d96d-4dcd-b7a1-010ff5e40cae-kubelet-config\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.238859 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.238733 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/65540f57-d96d-4dcd-b7a1-010ff5e40cae-dbus\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.238859 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.238746 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.238859 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.238851 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:01.738799708 +0000 UTC m=+3.221948610 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.319371 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.319339 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" Apr 16 19:54:01.324199 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.324174 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-6j2q7" Apr 16 19:54:01.333664 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.333640 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c96xs" Apr 16 19:54:01.338154 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.338132 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ck9lp" Apr 16 19:54:01.344820 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.344802 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:54:01.351470 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.351453 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h2cnc" Apr 16 19:54:01.358015 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.357994 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" Apr 16 19:54:01.362559 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.362542 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xbhwz" Apr 16 19:54:01.635694 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.635445 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24c64247_d926_464e_86ae_324af71b98d9.slice/crio-4eb2f06da2891c36077d0b8f90659d8699a47632dc089a8414624daa10cf163a WatchSource:0}: Error finding container 4eb2f06da2891c36077d0b8f90659d8699a47632dc089a8414624daa10cf163a: Status 404 returned error can't find the container with id 4eb2f06da2891c36077d0b8f90659d8699a47632dc089a8414624daa10cf163a Apr 16 19:54:01.636996 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.636873 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b791b2a_d9bd_4b03_8a90_5f7d696bf7d5.slice/crio-6648c5a930fed66f73a6cdef23649c50fa656e6def6ccd2a15bc08414ba967ac WatchSource:0}: Error finding container 6648c5a930fed66f73a6cdef23649c50fa656e6def6ccd2a15bc08414ba967ac: Status 404 returned error can't find the container with id 6648c5a930fed66f73a6cdef23649c50fa656e6def6ccd2a15bc08414ba967ac Apr 16 19:54:01.639340 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.639317 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e620d6_c02b_44e6_897b_45e0488dc88a.slice/crio-d433423f1b4b7116e70e8be88f5189664934a9b79e47dfb4d9cbdab8925f53b6 WatchSource:0}: Error finding container d433423f1b4b7116e70e8be88f5189664934a9b79e47dfb4d9cbdab8925f53b6: Status 404 returned error can't find the container with id d433423f1b4b7116e70e8be88f5189664934a9b79e47dfb4d9cbdab8925f53b6 Apr 16 19:54:01.640315 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.640288 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9ec80f_6c8e_4cdf_989c_4c357a66efe8.slice/crio-00e9ff5ec642589affa588855565ff5ffeef230f6cb5b5b940677cd249a0d7cc WatchSource:0}: Error finding container 00e9ff5ec642589affa588855565ff5ffeef230f6cb5b5b940677cd249a0d7cc: Status 404 returned error can't find the container with id 00e9ff5ec642589affa588855565ff5ffeef230f6cb5b5b940677cd249a0d7cc Apr 16 19:54:01.641172 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.641146 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod012be36b_5039_4ab8_82de_702be926779a.slice/crio-e87d05b6769ba7699d3f771b839902cecb9f2b143bc521541fa4602ee47065a4 WatchSource:0}: Error finding container e87d05b6769ba7699d3f771b839902cecb9f2b143bc521541fa4602ee47065a4: Status 404 returned error can't find the container with id e87d05b6769ba7699d3f771b839902cecb9f2b143bc521541fa4602ee47065a4 Apr 16 19:54:01.641408 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.641388 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:01.641567 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.641551 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.641631 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.641606 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:02.641586963 +0000 UTC m=+4.124735873 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:01.641818 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.641747 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8890691e_8929_4051_8642_9a4556f51961.slice/crio-3357e5725f66cc818bf89902d1e09449767ca80d51cba266df42bfd8d051f0bd WatchSource:0}: Error finding container 3357e5725f66cc818bf89902d1e09449767ca80d51cba266df42bfd8d051f0bd: Status 404 returned error can't find the container with id 3357e5725f66cc818bf89902d1e09449767ca80d51cba266df42bfd8d051f0bd Apr 16 19:54:01.642973 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.642896 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7bfe2d4_039f_4e5e_bd3c_5295e58ee27c.slice/crio-98eac5fdfdf2b59f40c65d612ddcb26898a44762e8e80fbfa1b6ca0d1b7d22be WatchSource:0}: Error finding container 98eac5fdfdf2b59f40c65d612ddcb26898a44762e8e80fbfa1b6ca0d1b7d22be: Status 404 returned error can't find the container with id 98eac5fdfdf2b59f40c65d612ddcb26898a44762e8e80fbfa1b6ca0d1b7d22be Apr 16 19:54:01.644772 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:01.644702 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e7c7d5_574f_4907_8f05_2b58d5c7118f.slice/crio-79ce842ab10b0366ee442389149cd3ed190cea3fdc64f4b6441a3e909206b36d WatchSource:0}: Error finding container 79ce842ab10b0366ee442389149cd3ed190cea3fdc64f4b6441a3e909206b36d: Status 404 returned error can't find the container with id 79ce842ab10b0366ee442389149cd3ed190cea3fdc64f4b6441a3e909206b36d Apr 16 19:54:01.741858 
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.741814 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:01.741969 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:01.741887 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:01.741969 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.741942 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:01.741969 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.741961 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:01.742067 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.741972 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:01.742067 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.741989 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:01.742067 ip-10-0-130-164 
kubenswrapper[2561]: E0416 19:54:01.742023 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:02.74200647 +0000 UTC m=+4.225155370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:01.742067 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:01.742038 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:02.742031435 +0000 UTC m=+4.225180331 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:02.067673 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.067625 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:49:00 +0000 UTC" deadline="2027-12-12 18:20:57.595812705 +0000 UTC"
Apr 16 19:54:02.067673 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.067662 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14518h26m55.528154448s"
Apr 16 19:54:02.162507 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.162468 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" event={"ID":"24c64247-d926-464e-86ae-324af71b98d9","Type":"ContainerStarted","Data":"4eb2f06da2891c36077d0b8f90659d8699a47632dc089a8414624daa10cf163a"}
Apr 16 19:54:02.166901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.166846 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"79ce842ab10b0366ee442389149cd3ed190cea3fdc64f4b6441a3e909206b36d"}
Apr 16 19:54:02.168440 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.168415 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xbhwz" event={"ID":"012be36b-5039-4ab8-82de-702be926779a","Type":"ContainerStarted","Data":"e87d05b6769ba7699d3f771b839902cecb9f2b143bc521541fa4602ee47065a4"}
Apr 16 19:54:02.170913 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.170500 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6j2q7" event={"ID":"87e620d6-c02b-44e6-897b-45e0488dc88a","Type":"ContainerStarted","Data":"d433423f1b4b7116e70e8be88f5189664934a9b79e47dfb4d9cbdab8925f53b6"}
Apr 16 19:54:02.172105 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.172081 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ck9lp" event={"ID":"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5","Type":"ContainerStarted","Data":"6648c5a930fed66f73a6cdef23649c50fa656e6def6ccd2a15bc08414ba967ac"}
Apr 16 19:54:02.177736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.177713 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal" event={"ID":"3a0a5c69e63ac560ca3e107bf62451fd","Type":"ContainerStarted","Data":"80552f55b459fc6c5bcac92967e89519ae4cf23af4ace7c461efcaab7e3f92f5"}
Apr 16 19:54:02.180128 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.180108 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerStarted","Data":"98eac5fdfdf2b59f40c65d612ddcb26898a44762e8e80fbfa1b6ca0d1b7d22be"}
Apr 16 19:54:02.182425 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.182401 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h2cnc" event={"ID":"8890691e-8929-4051-8642-9a4556f51961","Type":"ContainerStarted","Data":"3357e5725f66cc818bf89902d1e09449767ca80d51cba266df42bfd8d051f0bd"}
Apr 16 19:54:02.185299 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.185273 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" event={"ID":"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8","Type":"ContainerStarted","Data":"00e9ff5ec642589affa588855565ff5ffeef230f6cb5b5b940677cd249a0d7cc"}
Apr 16 19:54:02.649756 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.649725 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:02.649929 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.649905 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:02.650006 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.649974 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.649955633 +0000 UTC m=+6.133104536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:02.750434 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.750397 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:02.750624 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.750469 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:02.750624 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.750594 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:02.750733 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.750654 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.750636246 +0000 UTC m=+6.233785163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:02.751078 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.751059 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:02.751078 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.751081 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:02.751240 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.751093 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:02.751240 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:02.751141 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:04.751126425 +0000 UTC m=+6.234275328 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:02.925703 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:02.925418 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:54:03.150358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:03.150283 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:03.150358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:03.150358 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:03.150908 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:03.150475 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:03.150908 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:03.150807 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:03.151028 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:03.150912 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:03.151028 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:03.150995 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:03.209395 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:03.209308 2561 generic.go:358] "Generic (PLEG): container finished" podID="fc485f6a70ac1e219db7780dcfe225f6" containerID="244e9fdeadc430dcf4d5aa2ded7353778f55bb1ca324d0cbe64d99e4b5b449b7" exitCode=0
Apr 16 19:54:03.209564 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:03.209480 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal" event={"ID":"fc485f6a70ac1e219db7780dcfe225f6","Type":"ContainerDied","Data":"244e9fdeadc430dcf4d5aa2ded7353778f55bb1ca324d0cbe64d99e4b5b449b7"}
Apr 16 19:54:03.228768 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:03.228421 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-164.ec2.internal" podStartSLOduration=3.22840366 podStartE2EDuration="3.22840366s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:02.191016631 +0000 UTC m=+3.674165554" watchObservedRunningTime="2026-04-16 19:54:03.22840366 +0000 UTC m=+4.711552581"
Apr 16 19:54:04.218026 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:04.217942 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal" event={"ID":"fc485f6a70ac1e219db7780dcfe225f6","Type":"ContainerStarted","Data":"9797a855c6fbdd1c320bf29faa6adad3978b9705565f3a07b580138ab7067e72"}
Apr 16 19:54:04.236495 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:04.236434 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-164.ec2.internal" podStartSLOduration=4.23641569 podStartE2EDuration="4.23641569s" podCreationTimestamp="2026-04-16 19:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:04.235982868 +0000 UTC m=+5.719131787" watchObservedRunningTime="2026-04-16 19:54:04.23641569 +0000 UTC m=+5.719564613"
Apr 16 19:54:04.668190 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:04.668157 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:04.668365 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.668325 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:04.668427 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.668384 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:08.668366158 +0000 UTC m=+10.151515063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:04.769376 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:04.769342 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:04.769557 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:04.769419 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:04.769557 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.769546 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:04.769667 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.769606 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:08.76958688 +0000 UTC m=+10.252735779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:04.770036 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.770014 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:04.770036 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.770039 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:04.770192 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.770052 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:04.770192 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:04.770099 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:08.770084083 +0000 UTC m=+10.253232983 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:05.150063 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:05.149990 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:05.150063 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:05.150038 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:05.150278 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:05.150116 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:05.150278 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:05.150183 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:05.150389 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:05.150319 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:05.150442 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:05.150406 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:07.150058 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:07.150022 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:07.150058 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:07.150052 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:07.150570 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:07.150027 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:07.150570 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:07.150164 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:07.150570 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:07.150241 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:07.150570 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:07.150332 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:08.703524 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:08.703482 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:08.704014 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.703652 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:08.704014 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.703736 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.703712939 +0000 UTC m=+18.186861850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:08.804012 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:08.803973 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:08.804172 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:08.804058 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:08.804243 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.804184 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:08.804294 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.804266 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.804244394 +0000 UTC m=+18.287393315 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:08.804294 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.804195 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:08.804409 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.804301 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:08.804409 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.804313 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:08.804409 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:08.804355 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:16.804343198 +0000 UTC m=+18.287492095 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:09.151387 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:09.151312 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:09.151544 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:09.151427 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:09.151544 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:09.151480 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:09.151657 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:09.151548 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:09.151942 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:09.151920 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:09.152043 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:09.152013 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:11.150892 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:11.150862 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:11.151347 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:11.150852 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:11.151347 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:11.150976 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:11.151347 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:11.150878 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:11.151347 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:11.151043 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:11.151347 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:11.151152 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:13.150748 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:13.150709 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:13.151203 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:13.150848 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:13.151203 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:13.150853 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:13.151203 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:13.150873 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:13.151203 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:13.150961 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:13.151203 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:13.151048 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:14.706009 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.705974 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bz8k8"]
Apr 16 19:54:14.726392 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.726366 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.728402 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.728342 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 19:54:14.728402 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.728339 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 19:54:14.728616 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.728501 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wgp9m\""
Apr 16 19:54:14.849192 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.849157 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cba09151-5332-4eae-8b38-4f1cfe937ce5-tmp-dir\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.849374 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.849205 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chwtm\" (UniqueName: \"kubernetes.io/projected/cba09151-5332-4eae-8b38-4f1cfe937ce5-kube-api-access-chwtm\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.849374 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.849238 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cba09151-5332-4eae-8b38-4f1cfe937ce5-hosts-file\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.950628 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.950597 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cba09151-5332-4eae-8b38-4f1cfe937ce5-tmp-dir\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.950784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.950633 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chwtm\" (UniqueName: \"kubernetes.io/projected/cba09151-5332-4eae-8b38-4f1cfe937ce5-kube-api-access-chwtm\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.950784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.950665 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cba09151-5332-4eae-8b38-4f1cfe937ce5-hosts-file\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.950784 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.950777 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cba09151-5332-4eae-8b38-4f1cfe937ce5-hosts-file\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.951009 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.950988 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cba09151-5332-4eae-8b38-4f1cfe937ce5-tmp-dir\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:14.960706 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:14.960633 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chwtm\" (UniqueName: \"kubernetes.io/projected/cba09151-5332-4eae-8b38-4f1cfe937ce5-kube-api-access-chwtm\") pod \"node-resolver-bz8k8\" (UID: \"cba09151-5332-4eae-8b38-4f1cfe937ce5\") " pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:15.038339 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:15.038304 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bz8k8"
Apr 16 19:54:15.150419 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:15.150391 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:15.150583 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:15.150400 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:15.150583 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:15.150500 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:15.150701 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:15.150400 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:15.150701 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:15.150610 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae" Apr 16 19:54:15.150790 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:15.150713 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e" Apr 16 19:54:16.762658 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:16.762626 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:16.763119 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.762764 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:16.763119 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.762842 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.762809856 +0000 UTC m=+34.245958754 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:54:16.863679 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:16.863645 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:16.863854 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:16.863719 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:16.863854 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.863773 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:54:16.863854 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.863791 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:54:16.863854 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.863803 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:16.863854 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.863823 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:16.864074 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.863881 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.863862956 +0000 UTC m=+34.347011866 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:54:16.864074 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:16.863897 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:54:32.86389051 +0000 UTC m=+34.347039407 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:54:17.150929 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:17.150842 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:17.150929 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:17.150886 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:17.150929 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:17.150886 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:17.151192 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:17.150973 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131" Apr 16 19:54:17.151192 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:17.151099 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae" Apr 16 19:54:17.151304 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:17.151201 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e" Apr 16 19:54:18.270697 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:18.270673 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba09151_5332_4eae_8b38_4f1cfe937ce5.slice/crio-409160c1a33f7af0efc678b71e070de9a7b4011bcc547a6049cfe8abc4d16b7d WatchSource:0}: Error finding container 409160c1a33f7af0efc678b71e070de9a7b4011bcc547a6049cfe8abc4d16b7d: Status 404 returned error can't find the container with id 409160c1a33f7af0efc678b71e070de9a7b4011bcc547a6049cfe8abc4d16b7d Apr 16 19:54:19.151800 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.151358 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:19.151981 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.151428 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:19.151981 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:19.151878 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131" Apr 16 19:54:19.151981 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.151458 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:19.151981 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:19.151965 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae" Apr 16 19:54:19.152177 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:19.152032 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e" Apr 16 19:54:19.244158 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.244116 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bz8k8" event={"ID":"cba09151-5332-4eae-8b38-4f1cfe937ce5","Type":"ContainerStarted","Data":"fd044b784149271198fd1c1e6499f867cacc44d74c7e0d3095ee3e07fdc6fa65"} Apr 16 19:54:19.244308 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.244171 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bz8k8" event={"ID":"cba09151-5332-4eae-8b38-4f1cfe937ce5","Type":"ContainerStarted","Data":"409160c1a33f7af0efc678b71e070de9a7b4011bcc547a6049cfe8abc4d16b7d"} Apr 16 19:54:19.245846 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.245791 2561 generic.go:358] "Generic (PLEG): container finished" podID="c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c" containerID="36102eb21d8df612ff7c122685ec7912171873f421144f0a407852af2507ac7e" exitCode=0 Apr 16 19:54:19.245964 
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.245894 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerDied","Data":"36102eb21d8df612ff7c122685ec7912171873f421144f0a407852af2507ac7e"} Apr 16 19:54:19.247506 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.247487 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h2cnc" event={"ID":"8890691e-8929-4051-8642-9a4556f51961","Type":"ContainerStarted","Data":"991bfdc5a84a6fc0e53947ded2912ccc705d7eaf0f2b365231505a508084687f"} Apr 16 19:54:19.250005 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.249763 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" event={"ID":"5f9ec80f-6c8e-4cdf-989c-4c357a66efe8","Type":"ContainerStarted","Data":"3069fd5a36d719b34240a0148d73e40a745562fc00744d95435e5c970e6bc978"} Apr 16 19:54:19.251458 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.251433 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" event={"ID":"24c64247-d926-464e-86ae-324af71b98d9","Type":"ContainerStarted","Data":"65c72fb7bca336cfa0400fa603db1acfd40c50c898ead19829d11d7222db1090"} Apr 16 19:54:19.254394 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.254368 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"91b227ad5cd33901cf5c7867ca3e2fa934b18b6717c44d36f64d2455d60d6c4c"} Apr 16 19:54:19.254476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.254399 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" 
event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"f01acb3ab5a2fde8351b62fa9285be21779d7f91e0e71dc34aef678526cdd0ea"} Apr 16 19:54:19.254476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.254413 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"80dab1e966feba488f99e9cc119a3aea5493cccb96725e47a7650eb2dc85b67f"} Apr 16 19:54:19.254476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.254427 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"e20967c1b3d720b3d9288e029cf0c1050e895f3fef5048dfd92231243d5a0c3c"} Apr 16 19:54:19.254476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.254440 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"48407601be71042961945436af2506b3b3706a70be70f1301077d8706da73e06"} Apr 16 19:54:19.254476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.254451 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"3e41f1313289bb3d64b6903c47bfa960a2725b6de1b3ec0e07f1faf9bd6d5348"} Apr 16 19:54:19.255756 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.255735 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xbhwz" event={"ID":"012be36b-5039-4ab8-82de-702be926779a","Type":"ContainerStarted","Data":"8bcf5720adec2d20f52cd828c0945785bdb6f0f8cc4f0a665fcd6ba40a0c8c6f"} Apr 16 19:54:19.257553 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.257529 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-6j2q7" event={"ID":"87e620d6-c02b-44e6-897b-45e0488dc88a","Type":"ContainerStarted","Data":"44fe5fde17cd0494332bf937de6bd71b39cc812fb1acfff5d7ff397c43fcdc5e"} Apr 16 19:54:19.258489 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.258449 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bz8k8" podStartSLOduration=5.258435302 podStartE2EDuration="5.258435302s" podCreationTimestamp="2026-04-16 19:54:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:19.258074833 +0000 UTC m=+20.741223752" watchObservedRunningTime="2026-04-16 19:54:19.258435302 +0000 UTC m=+20.741584222" Apr 16 19:54:19.316877 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.316843 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h2cnc" podStartSLOduration=3.717449646 podStartE2EDuration="20.316813111s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.643402753 +0000 UTC m=+3.126551650" lastFinishedPulling="2026-04-16 19:54:18.242766214 +0000 UTC m=+19.725915115" observedRunningTime="2026-04-16 19:54:19.297814913 +0000 UTC m=+20.780963832" watchObservedRunningTime="2026-04-16 19:54:19.316813111 +0000 UTC m=+20.799962012" Apr 16 19:54:19.317405 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.317235 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6j2q7" podStartSLOduration=3.6295082069999998 podStartE2EDuration="20.317228988s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.641058128 +0000 UTC m=+3.124207027" lastFinishedPulling="2026-04-16 19:54:18.328778897 +0000 UTC m=+19.811927808" observedRunningTime="2026-04-16 19:54:19.31667752 +0000 UTC m=+20.799826439" watchObservedRunningTime="2026-04-16 
19:54:19.317228988 +0000 UTC m=+20.800377907" Apr 16 19:54:19.334958 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.334913 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t9q6h" podStartSLOduration=3.734659282 podStartE2EDuration="20.33481543s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.642614825 +0000 UTC m=+3.125763727" lastFinishedPulling="2026-04-16 19:54:18.242770977 +0000 UTC m=+19.725919875" observedRunningTime="2026-04-16 19:54:19.334676066 +0000 UTC m=+20.817824986" watchObservedRunningTime="2026-04-16 19:54:19.33481543 +0000 UTC m=+20.817964346" Apr 16 19:54:19.354753 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.354706 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xbhwz" podStartSLOduration=3.755211006 podStartE2EDuration="20.354690649s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.643231653 +0000 UTC m=+3.126380568" lastFinishedPulling="2026-04-16 19:54:18.242711311 +0000 UTC m=+19.725860211" observedRunningTime="2026-04-16 19:54:19.354257945 +0000 UTC m=+20.837406866" watchObservedRunningTime="2026-04-16 19:54:19.354690649 +0000 UTC m=+20.837839568" Apr 16 19:54:19.432219 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:19.432193 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:54:20.094577 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:20.094480 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:54:19.432214944Z","UUID":"d1e6f225-bf59-45a7-ac0a-6e6a6b799927","Handler":null,"Name":"","Endpoint":""} Apr 16 19:54:20.099225 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:54:20.099178 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:54:20.099225 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:20.099212 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:54:20.260736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:20.260704 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ck9lp" event={"ID":"8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5","Type":"ContainerStarted","Data":"cedf812f1c4d94c1c045c495090cac3358c5a5c342a0e7b2e38f49f730ada27e"} Apr 16 19:54:20.263087 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:20.262631 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" event={"ID":"24c64247-d926-464e-86ae-324af71b98d9","Type":"ContainerStarted","Data":"bc26b0c198b2b458356b2769f18670a7ca6761c6e132fd46b22db19da917a4c8"} Apr 16 19:54:20.278701 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:20.278019 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ck9lp" podStartSLOduration=4.674041104 podStartE2EDuration="21.27800065s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.638775735 +0000 UTC m=+3.121924632" lastFinishedPulling="2026-04-16 19:54:18.24273528 +0000 UTC m=+19.725884178" observedRunningTime="2026-04-16 19:54:20.277386175 +0000 UTC m=+21.760535095" watchObservedRunningTime="2026-04-16 19:54:20.27800065 +0000 UTC m=+21.761149569" Apr 16 19:54:21.150492 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:21.150415 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:54:21.151233 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:21.150415 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:21.151233 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:21.150541 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae" Apr 16 19:54:21.151233 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:21.150663 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e" Apr 16 19:54:21.151233 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:21.150415 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:54:21.151233 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:21.150804 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131" Apr 16 19:54:21.267189 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:21.267152 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" event={"ID":"24c64247-d926-464e-86ae-324af71b98d9","Type":"ContainerStarted","Data":"e01f99222a033e62905cae7bae9da47ecc35af17377c67b57e0809ae6aeb64c7"} Apr 16 19:54:21.270507 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:21.270473 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"b58d8fea4aa848f0fc87deed1b43d3f2be5c7d0cf084c257652cfed5ddd97ac5"} Apr 16 19:54:21.285777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:21.285734 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ljfx4" podStartSLOduration=3.689090997 podStartE2EDuration="22.285721256s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.63782045 +0000 UTC m=+3.120969353" lastFinishedPulling="2026-04-16 19:54:20.2344507 +0000 UTC m=+21.717599612" observedRunningTime="2026-04-16 19:54:21.285374969 +0000 UTC m=+22.768523910" watchObservedRunningTime="2026-04-16 19:54:21.285721256 +0000 UTC m=+22.768870174" Apr 16 19:54:23.150304 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.149995 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:54:23.150304 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.149995 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:23.151147 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:23.150334 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:23.151147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.149995 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:23.151147 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:23.150397 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:23.151147 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:23.150473 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:23.275955 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.275921 2561 generic.go:358] "Generic (PLEG): container finished" podID="c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c" containerID="905a954466696f114be898cfb62cf4a354c7d43710f8416dbba60e41d37fe74a" exitCode=0
Apr 16 19:54:23.276073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.275998 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerDied","Data":"905a954466696f114be898cfb62cf4a354c7d43710f8416dbba60e41d37fe74a"}
Apr 16 19:54:23.281740 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.281703 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" event={"ID":"c5e7c7d5-574f-4907-8f05-2b58d5c7118f","Type":"ContainerStarted","Data":"268478ef93ad671a0d7eb182387bc2c8f139aab2ceb559a0049b865fb30f13f9"}
Apr 16 19:54:23.282230 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.282215 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:23.282286 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.282237 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:23.297157 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.297131 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:23.326384 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:23.326336 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" podStartSLOduration=7.644775666 podStartE2EDuration="24.326322277s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.646790956 +0000 UTC m=+3.129939854" lastFinishedPulling="2026-04-16 19:54:18.328337555 +0000 UTC m=+19.811486465" observedRunningTime="2026-04-16 19:54:23.326115592 +0000 UTC m=+24.809264512" watchObservedRunningTime="2026-04-16 19:54:23.326322277 +0000 UTC m=+24.809471195"
Apr 16 19:54:24.198677 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.198649 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h2cnc"
Apr 16 19:54:24.199252 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.199235 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h2cnc"
Apr 16 19:54:24.285577 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.285543 2561 generic.go:358] "Generic (PLEG): container finished" podID="c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c" containerID="dc053d361ba5766762a07732c08f1ead26430550944515ef8cfc790e5d288278" exitCode=0
Apr 16 19:54:24.285751 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.285664 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerDied","Data":"dc053d361ba5766762a07732c08f1ead26430550944515ef8cfc790e5d288278"}
Apr 16 19:54:24.286421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.286411 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:24.303231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.303200 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh"
Apr 16 19:54:24.992722 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.992572 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jkgr6"]
Apr 16 19:54:24.992887 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.992868 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:24.992990 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:24.992971 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:24.995771 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.995743 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hkqtk"]
Apr 16 19:54:24.995981 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:24.995963 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:24.996109 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:24.996085 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:25.006360 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:25.006258 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dghfv"]
Apr 16 19:54:25.006447 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:25.006387 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:25.006504 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:25.006484 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:25.290319 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:25.290286 2561 generic.go:358] "Generic (PLEG): container finished" podID="c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c" containerID="38b7df5f895873d8cc86d1c0961ec85c2d9b284e4fcbeedcf35c278026f0a6b6" exitCode=0
Apr 16 19:54:25.291080 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:25.290367 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerDied","Data":"38b7df5f895873d8cc86d1c0961ec85c2d9b284e4fcbeedcf35c278026f0a6b6"}
Apr 16 19:54:27.150498 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:27.150460 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:27.150925 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:27.150462 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:27.150925 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:27.150573 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:27.150925 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:27.150469 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:27.150925 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:27.150683 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:27.150925 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:27.150757 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:27.689203 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:27.689171 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h2cnc"
Apr 16 19:54:27.689352 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:27.689321 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 19:54:27.690745 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:27.690579 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h2cnc"
Apr 16 19:54:28.517009 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:28.516986 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bz8k8_cba09151-5332-4eae-8b38-4f1cfe937ce5/dns-node-resolver/0.log"
Apr 16 19:54:29.151390 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:29.151312 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:29.151390 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:29.151385 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:29.151594 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:29.151448 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:29.151594 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:29.151522 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:29.151594 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:29.151523 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:29.151813 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:29.151610 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:29.898279 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:29.898251 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xbhwz_012be36b-5039-4ab8-82de-702be926779a/node-ca/0.log"
Apr 16 19:54:31.153398 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:31.153307 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:31.153788 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:31.153308 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:31.153788 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:31.153410 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hkqtk" podUID="a0ce8326-ca5a-49b9-90c0-94db13c2c74e"
Apr 16 19:54:31.153788 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:31.153308 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:31.153788 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:31.153485 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dghfv" podUID="65540f57-d96d-4dcd-b7a1-010ff5e40cae"
Apr 16 19:54:31.153788 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:31.153580 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jkgr6" podUID="3e96a428-4bd2-4a4f-b624-974f68f14131"
Apr 16 19:54:31.380390 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:31.380361 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-164.ec2.internal" event="NodeReady"
Apr 16 19:54:31.380533 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:31.380468 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 19:54:32.307871 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:32.307837 2561 generic.go:358] "Generic (PLEG): container finished" podID="c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c" containerID="b66cea8cc8f25054dcb2792adc4060839d033c0da0c796aeddd0c77f535b96db" exitCode=0
Apr 16 19:54:32.308282 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:32.307902 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerDied","Data":"b66cea8cc8f25054dcb2792adc4060839d033c0da0c796aeddd0c77f535b96db"}
Apr 16 19:54:32.784808 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:32.784776 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:32.784970 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.784925 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:32.785009 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.784986 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs podName:a0ce8326-ca5a-49b9-90c0-94db13c2c74e nodeName:}" failed. No retries permitted until 2026-04-16 19:55:04.784970637 +0000 UTC m=+66.268119538 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs") pod "network-metrics-daemon-hkqtk" (UID: "a0ce8326-ca5a-49b9-90c0-94db13c2c74e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:54:32.885751 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:32.885716 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:32.885922 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:32.885767 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:32.885922 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.885893 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:32.886044 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.885921 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:54:32.886044 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.885963 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:54:32.886044 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.885980 2561 projected.go:194] Error preparing data for projected volume kube-api-access-lw4wh for pod openshift-network-diagnostics/network-check-target-jkgr6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:32.886044 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.885996 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret podName:65540f57-d96d-4dcd-b7a1-010ff5e40cae nodeName:}" failed. No retries permitted until 2026-04-16 19:55:04.885982877 +0000 UTC m=+66.369131774 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret") pod "global-pull-secret-syncer-dghfv" (UID: "65540f57-d96d-4dcd-b7a1-010ff5e40cae") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:54:32.886044 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:54:32.886031 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh podName:3e96a428-4bd2-4a4f-b624-974f68f14131 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:04.886014297 +0000 UTC m=+66.369163196 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "kube-api-access-lw4wh" (UniqueName: "kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh") pod "network-check-target-jkgr6" (UID: "3e96a428-4bd2-4a4f-b624-974f68f14131") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:54:33.151009 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.150932 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:54:33.151009 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.150974 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv"
Apr 16 19:54:33.151284 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.150942 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk"
Apr 16 19:54:33.153546 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.153492 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:54:33.153546 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.153509 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:54:33.153546 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.153528 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:54:33.154214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.154199 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:54:33.154290 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.154200 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kmsjn\""
Apr 16 19:54:33.154290 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.154240 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t9ktw\""
Apr 16 19:54:33.311898 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.311867 2561 generic.go:358] "Generic (PLEG): container finished" podID="c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c" containerID="ecf99931cbfe01cab53c9b863ccc3bded5c97f1991f62701bff67f5dea6df55f" exitCode=0
Apr 16 19:54:33.312248 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:33.311928 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerDied","Data":"ecf99931cbfe01cab53c9b863ccc3bded5c97f1991f62701bff67f5dea6df55f"}
Apr 16 19:54:34.316341 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:34.316310 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c96xs" event={"ID":"c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c","Type":"ContainerStarted","Data":"56aac7c9714249c64e6fec8c4318f0a9c0db68c0e003c783266ac343c3f673be"}
Apr 16 19:54:34.341554 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:34.341507 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c96xs" podStartSLOduration=5.830030584 podStartE2EDuration="35.341494067s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:54:01.645516844 +0000 UTC m=+3.128665741" lastFinishedPulling="2026-04-16 19:54:31.156980323 +0000 UTC m=+32.640129224" observedRunningTime="2026-04-16 19:54:34.341167803 +0000 UTC m=+35.824316721" watchObservedRunningTime="2026-04-16 19:54:34.341494067 +0000 UTC m=+35.824642986"
Apr 16 19:54:50.781940 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.781732 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68b985c7d7-k2vd9"]
Apr 16 19:54:50.784603 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.784587 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.787453 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.787428 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 19:54:50.787612 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.787576 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 19:54:50.787684 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.787640 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 19:54:50.790287 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.790256 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jx8h5\""
Apr 16 19:54:50.797686 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.797651 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 19:54:50.812215 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.812182 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-89xkw"]
Apr 16 19:54:50.814995 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.814979 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68b985c7d7-k2vd9"]
Apr 16 19:54:50.815094 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.815001 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-htqsk"]
Apr 16 19:54:50.815170 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.815151 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-89xkw"
Apr 16 19:54:50.817771 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.817749 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-htqsk"
Apr 16 19:54:50.818417 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.818398 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 19:54:50.818548 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.818463 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 19:54:50.818548 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.818483 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 19:54:50.818738 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.818723 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 19:54:50.818780 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.818758 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-676kc\""
Apr 16 19:54:50.819580 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.819554 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 19:54:50.819970 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.819956 2561 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 19:54:50.820027 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.819966 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 19:54:50.820027 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.819956 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mmrlb\""
Apr 16 19:54:50.831701 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.831680 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-htqsk"]
Apr 16 19:54:50.834439 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.834418 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-89xkw"]
Apr 16 19:54:50.906260 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.906226 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pzd2h"]
Apr 16 19:54:50.909436 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909407 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pzd2h"
Apr 16 19:54:50.909585 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909498 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8f97a0d-d529-41d9-b1dd-4aa61965968d-registry-certificates\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909585 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909527 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4txg\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-kube-api-access-b4txg\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909688 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909609 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8f97a0d-d529-41d9-b1dd-4aa61965968d-trusted-ca\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909688 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8f97a0d-d529-41d9-b1dd-4aa61965968d-image-registry-private-configuration\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909688 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909655 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-bound-sa-token\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909818 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909709 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8f97a0d-d529-41d9-b1dd-4aa61965968d-installation-pull-secrets\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909818 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909786 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-registry-tls\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.909909 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.909845 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8f97a0d-d529-41d9-b1dd-4aa61965968d-ca-trust-extracted\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:50.911334 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.911316 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dknqq\""
Apr 16 19:54:50.912643 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.912621 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 19:54:50.912953 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.912940 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 19:54:50.924127 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:50.924109 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pzd2h"]
Apr 16 19:54:51.011091 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011058 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8f97a0d-d529-41d9-b1dd-4aa61965968d-installation-pull-secrets\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9"
Apr 16 19:54:51.011091 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011094 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/854ee71f-1cd4-4645-9909-8f0da088e901-metrics-tls\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h"
Apr 16 19:54:51.011336 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011120 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9dabfc-3f32-488d-81d1-0da63416b50c-cert\") pod \"ingress-canary-htqsk\" (UID: \"3d9dabfc-3f32-488d-81d1-0da63416b50c\") " pod="openshift-ingress-canary/ingress-canary-htqsk"
Apr 16 19:54:51.011336 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011176 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName:
\"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-registry-tls\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011336 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011217 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8f97a0d-d529-41d9-b1dd-4aa61965968d-ca-trust-extracted\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011353 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36f87ce1-84b3-4474-9bf8-1034549406d7-data-volume\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.011485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011384 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36f87ce1-84b3-4474-9bf8-1034549406d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.011485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011415 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlxk\" (UniqueName: \"kubernetes.io/projected/854ee71f-1cd4-4645-9909-8f0da088e901-kube-api-access-cvlxk\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 
19:54:51.011485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011438 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36f87ce1-84b3-4474-9bf8-1034549406d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.011485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011468 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8f97a0d-d529-41d9-b1dd-4aa61965968d-registry-certificates\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011485 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011488 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4txg\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-kube-api-access-b4txg\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011507 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36f87ce1-84b3-4474-9bf8-1034549406d7-crio-socket\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.011777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011588 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8f97a0d-d529-41d9-b1dd-4aa61965968d-trusted-ca\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011613 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/854ee71f-1cd4-4645-9909-8f0da088e901-tmp-dir\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.011777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011634 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpbv\" (UniqueName: \"kubernetes.io/projected/3d9dabfc-3f32-488d-81d1-0da63416b50c-kube-api-access-hbpbv\") pod \"ingress-canary-htqsk\" (UID: \"3d9dabfc-3f32-488d-81d1-0da63416b50c\") " pod="openshift-ingress-canary/ingress-canary-htqsk" Apr 16 19:54:51.011777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011663 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8f97a0d-d529-41d9-b1dd-4aa61965968d-image-registry-private-configuration\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011682 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8f97a0d-d529-41d9-b1dd-4aa61965968d-ca-trust-extracted\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.011777 
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011688 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-bound-sa-token\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.012096 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011905 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854ee71f-1cd4-4645-9909-8f0da088e901-config-volume\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.012096 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.011934 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xknwk\" (UniqueName: \"kubernetes.io/projected/36f87ce1-84b3-4474-9bf8-1034549406d7-kube-api-access-xknwk\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.012267 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.012249 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8f97a0d-d529-41d9-b1dd-4aa61965968d-registry-certificates\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.012869 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.012844 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8f97a0d-d529-41d9-b1dd-4aa61965968d-trusted-ca\") pod 
\"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.015254 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.015230 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8f97a0d-d529-41d9-b1dd-4aa61965968d-image-registry-private-configuration\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.015374 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.015234 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8f97a0d-d529-41d9-b1dd-4aa61965968d-installation-pull-secrets\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.015374 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.015355 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-registry-tls\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.021981 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.021960 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4txg\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-kube-api-access-b4txg\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.022782 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.022767 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8f97a0d-d529-41d9-b1dd-4aa61965968d-bound-sa-token\") pod \"image-registry-68b985c7d7-k2vd9\" (UID: \"f8f97a0d-d529-41d9-b1dd-4aa61965968d\") " pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.093605 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.093513 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:54:51.112846 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.112801 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlxk\" (UniqueName: \"kubernetes.io/projected/854ee71f-1cd4-4645-9909-8f0da088e901-kube-api-access-cvlxk\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.112990 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.112881 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36f87ce1-84b3-4474-9bf8-1034549406d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.112990 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.112921 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36f87ce1-84b3-4474-9bf8-1034549406d7-crio-socket\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.113101 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.112997 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/854ee71f-1cd4-4645-9909-8f0da088e901-tmp-dir\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.113101 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113022 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpbv\" (UniqueName: \"kubernetes.io/projected/3d9dabfc-3f32-488d-81d1-0da63416b50c-kube-api-access-hbpbv\") pod \"ingress-canary-htqsk\" (UID: \"3d9dabfc-3f32-488d-81d1-0da63416b50c\") " pod="openshift-ingress-canary/ingress-canary-htqsk" Apr 16 19:54:51.113101 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113067 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36f87ce1-84b3-4474-9bf8-1034549406d7-crio-socket\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.113233 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113124 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854ee71f-1cd4-4645-9909-8f0da088e901-config-volume\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.113288 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113252 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xknwk\" (UniqueName: \"kubernetes.io/projected/36f87ce1-84b3-4474-9bf8-1034549406d7-kube-api-access-xknwk\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.113337 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113291 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/854ee71f-1cd4-4645-9909-8f0da088e901-tmp-dir\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.113337 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113299 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/854ee71f-1cd4-4645-9909-8f0da088e901-metrics-tls\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.113438 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113340 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9dabfc-3f32-488d-81d1-0da63416b50c-cert\") pod \"ingress-canary-htqsk\" (UID: \"3d9dabfc-3f32-488d-81d1-0da63416b50c\") " pod="openshift-ingress-canary/ingress-canary-htqsk" Apr 16 19:54:51.113438 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113381 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36f87ce1-84b3-4474-9bf8-1034549406d7-data-volume\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.113438 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113419 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36f87ce1-84b3-4474-9bf8-1034549406d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.113804 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113744 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36f87ce1-84b3-4474-9bf8-1034549406d7-data-volume\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.113944 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.113860 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854ee71f-1cd4-4645-9909-8f0da088e901-config-volume\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.114137 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.114108 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36f87ce1-84b3-4474-9bf8-1034549406d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.115502 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.115479 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36f87ce1-84b3-4474-9bf8-1034549406d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.115712 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.115696 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d9dabfc-3f32-488d-81d1-0da63416b50c-cert\") pod \"ingress-canary-htqsk\" (UID: \"3d9dabfc-3f32-488d-81d1-0da63416b50c\") " pod="openshift-ingress-canary/ingress-canary-htqsk" Apr 16 19:54:51.115754 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:54:51.115735 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/854ee71f-1cd4-4645-9909-8f0da088e901-metrics-tls\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.124918 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.124890 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlxk\" (UniqueName: \"kubernetes.io/projected/854ee71f-1cd4-4645-9909-8f0da088e901-kube-api-access-cvlxk\") pod \"dns-default-pzd2h\" (UID: \"854ee71f-1cd4-4645-9909-8f0da088e901\") " pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.125557 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.125534 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpbv\" (UniqueName: \"kubernetes.io/projected/3d9dabfc-3f32-488d-81d1-0da63416b50c-kube-api-access-hbpbv\") pod \"ingress-canary-htqsk\" (UID: \"3d9dabfc-3f32-488d-81d1-0da63416b50c\") " pod="openshift-ingress-canary/ingress-canary-htqsk" Apr 16 19:54:51.126569 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.126553 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xknwk\" (UniqueName: \"kubernetes.io/projected/36f87ce1-84b3-4474-9bf8-1034549406d7-kube-api-access-xknwk\") pod \"insights-runtime-extractor-89xkw\" (UID: \"36f87ce1-84b3-4474-9bf8-1034549406d7\") " pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.130455 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.130440 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-htqsk" Apr 16 19:54:51.217730 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.217701 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:51.277436 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.276672 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68b985c7d7-k2vd9"] Apr 16 19:54:51.325648 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.325609 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-htqsk"] Apr 16 19:54:51.336081 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:51.336046 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9dabfc_3f32_488d_81d1_0da63416b50c.slice/crio-1d2f917ab9857bf775a1c57e1856b439d215ecf8ea378c7311ec74d2118b77db WatchSource:0}: Error finding container 1d2f917ab9857bf775a1c57e1856b439d215ecf8ea378c7311ec74d2118b77db: Status 404 returned error can't find the container with id 1d2f917ab9857bf775a1c57e1856b439d215ecf8ea378c7311ec74d2118b77db Apr 16 19:54:51.360724 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.360691 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" event={"ID":"f8f97a0d-d529-41d9-b1dd-4aa61965968d","Type":"ContainerStarted","Data":"c1857054dc1b1c30e512cda1418ae35e2ca04511e82dcf2ae2f392ccc22acfaa"} Apr 16 19:54:51.361643 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.361618 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-htqsk" event={"ID":"3d9dabfc-3f32-488d-81d1-0da63416b50c","Type":"ContainerStarted","Data":"1d2f917ab9857bf775a1c57e1856b439d215ecf8ea378c7311ec74d2118b77db"} Apr 16 19:54:51.367785 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.367762 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pzd2h"] Apr 16 19:54:51.370050 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:51.370027 2561 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854ee71f_1cd4_4645_9909_8f0da088e901.slice/crio-7f94a5bf0d3ad5c9cf1341102c9fd1c35eda8d073c9fb1ab282e7b2ddf04350a WatchSource:0}: Error finding container 7f94a5bf0d3ad5c9cf1341102c9fd1c35eda8d073c9fb1ab282e7b2ddf04350a: Status 404 returned error can't find the container with id 7f94a5bf0d3ad5c9cf1341102c9fd1c35eda8d073c9fb1ab282e7b2ddf04350a Apr 16 19:54:51.424708 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.424681 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-89xkw" Apr 16 19:54:51.553557 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:54:51.553525 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36f87ce1_84b3_4474_9bf8_1034549406d7.slice/crio-e7dba2f7474341ddb0694bbaf46a297eaeb06d2d7ab88cb01437402c370ce9e3 WatchSource:0}: Error finding container e7dba2f7474341ddb0694bbaf46a297eaeb06d2d7ab88cb01437402c370ce9e3: Status 404 returned error can't find the container with id e7dba2f7474341ddb0694bbaf46a297eaeb06d2d7ab88cb01437402c370ce9e3 Apr 16 19:54:51.556151 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:51.555847 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-89xkw"] Apr 16 19:54:52.366060 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.365943 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" event={"ID":"f8f97a0d-d529-41d9-b1dd-4aa61965968d","Type":"ContainerStarted","Data":"e2e2c9e1aeff97caee7cf05b91c17cb3fd6cb9df59cc93c9dade17e3a5c9abaf"} Apr 16 19:54:52.366548 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.366522 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 
19:54:52.367778 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.367750 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89xkw" event={"ID":"36f87ce1-84b3-4474-9bf8-1034549406d7","Type":"ContainerStarted","Data":"a6ae09cbcca3aba9dc6e5047f987dbd10b45fed5900ee913d36d600b27e38053"} Apr 16 19:54:52.367900 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.367786 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89xkw" event={"ID":"36f87ce1-84b3-4474-9bf8-1034549406d7","Type":"ContainerStarted","Data":"b10d2f94ec9236a6b72cc5917740f522aa9451f47eb73d8c2b5b41f9d0fd6626"} Apr 16 19:54:52.367900 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.367800 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89xkw" event={"ID":"36f87ce1-84b3-4474-9bf8-1034549406d7","Type":"ContainerStarted","Data":"e7dba2f7474341ddb0694bbaf46a297eaeb06d2d7ab88cb01437402c370ce9e3"} Apr 16 19:54:52.368963 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.368937 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pzd2h" event={"ID":"854ee71f-1cd4-4645-9909-8f0da088e901","Type":"ContainerStarted","Data":"7f94a5bf0d3ad5c9cf1341102c9fd1c35eda8d073c9fb1ab282e7b2ddf04350a"} Apr 16 19:54:52.434901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:52.434790 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" podStartSLOduration=2.434775518 podStartE2EDuration="2.434775518s" podCreationTimestamp="2026-04-16 19:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:54:52.433890124 +0000 UTC m=+53.917039044" watchObservedRunningTime="2026-04-16 19:54:52.434775518 +0000 UTC m=+53.917924437" Apr 16 19:54:53.374477 
ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:53.374440 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-htqsk" event={"ID":"3d9dabfc-3f32-488d-81d1-0da63416b50c","Type":"ContainerStarted","Data":"c1784d520cdcdfb620a51ec39c7cd9f0e3af054cac93c82f46084b93befc93c3"} Apr 16 19:54:53.377125 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:53.377072 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pzd2h" event={"ID":"854ee71f-1cd4-4645-9909-8f0da088e901","Type":"ContainerStarted","Data":"79693369f6cb3816bec1f6e8d8cc702619946206e8981936784d9c86d5e83020"} Apr 16 19:54:54.381597 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:54.381560 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pzd2h" event={"ID":"854ee71f-1cd4-4645-9909-8f0da088e901","Type":"ContainerStarted","Data":"4c58be561c216637e6e7efbcc70e2e6f17b618dd25c17d0dda09140caa071bf4"} Apr 16 19:54:54.403901 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:54.403851 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-htqsk" podStartSLOduration=2.5473188799999997 podStartE2EDuration="4.403818828s" podCreationTimestamp="2026-04-16 19:54:50 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.338192739 +0000 UTC m=+52.821341637" lastFinishedPulling="2026-04-16 19:54:53.194692688 +0000 UTC m=+54.677841585" observedRunningTime="2026-04-16 19:54:53.39394947 +0000 UTC m=+54.877098404" watchObservedRunningTime="2026-04-16 19:54:54.403818828 +0000 UTC m=+55.886967747" Apr 16 19:54:54.404042 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:54.403962 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pzd2h" podStartSLOduration=2.585172278 podStartE2EDuration="4.403956276s" podCreationTimestamp="2026-04-16 19:54:50 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.371741106 +0000 UTC 
m=+52.854890007" lastFinishedPulling="2026-04-16 19:54:53.190525103 +0000 UTC m=+54.673674005" observedRunningTime="2026-04-16 19:54:54.400453502 +0000 UTC m=+55.883602421" watchObservedRunningTime="2026-04-16 19:54:54.403956276 +0000 UTC m=+55.887105202" Apr 16 19:54:55.386730 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:55.386691 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-89xkw" event={"ID":"36f87ce1-84b3-4474-9bf8-1034549406d7","Type":"ContainerStarted","Data":"67a45bdbbbf5d972f5b0d44f2c5ea5d6421adbda25822500b35ee20756e6e407"} Apr 16 19:54:55.387094 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:55.386981 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pzd2h" Apr 16 19:54:55.404890 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:55.404819 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-89xkw" podStartSLOduration=2.393370617 podStartE2EDuration="5.404806602s" podCreationTimestamp="2026-04-16 19:54:50 +0000 UTC" firstStartedPulling="2026-04-16 19:54:51.613732206 +0000 UTC m=+53.096881103" lastFinishedPulling="2026-04-16 19:54:54.625168187 +0000 UTC m=+56.108317088" observedRunningTime="2026-04-16 19:54:55.403784508 +0000 UTC m=+56.886933428" watchObservedRunningTime="2026-04-16 19:54:55.404806602 +0000 UTC m=+56.887955520" Apr 16 19:54:56.303812 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:54:56.303782 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t5dmh" Apr 16 19:55:01.328724 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.328692 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-spv5g"] Apr 16 19:55:01.366005 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.365966 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.371096 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.371073 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:55:01.371201 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.371112 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:55:01.371201 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.371167 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:55:01.371333 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.371318 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:55:01.371392 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.371357 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6xp24\"" Apr 16 19:55:01.371446 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.371391 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:55:01.372278 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.372255 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:55:01.373367 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.373347 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5549q"] Apr 16 19:55:01.382795 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.382782 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.385028 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.385009 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 19:55:01.385124 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.385092 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-zf2pl\"" Apr 16 19:55:01.385419 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.385396 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:55:01.387212 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387191 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-metrics-client-ca\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387310 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387252 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-textfile\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387310 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387280 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-wtmp\") pod \"node-exporter-spv5g\" (UID: 
\"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387401 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387310 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-tls\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387401 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387383 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhc7w\" (UniqueName: \"kubernetes.io/projected/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-kube-api-access-hhc7w\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387499 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387435 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-sys\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387499 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387489 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387588 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387540 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" 
(UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-root\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.387631 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.387598 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-accelerators-collector-config\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.389522 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.389503 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5549q"] Apr 16 19:55:01.487885 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.487851 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-accelerators-collector-config\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.487885 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.487887 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-metrics-client-ca\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488124 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.487920 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-textfile\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488124 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488038 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3e122c2-a77f-466b-b593-7d6143d118cd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.488124 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488074 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-wtmp\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488124 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488104 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-tls\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488153 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2dk\" (UniqueName: \"kubernetes.io/projected/b3e122c2-a77f-466b-b593-7d6143d118cd-kube-api-access-6f2dk\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 
19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488206 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhc7w\" (UniqueName: \"kubernetes.io/projected/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-kube-api-access-hhc7w\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488232 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-textfile\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488234 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3e122c2-a77f-466b-b593-7d6143d118cd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:55:01.488281 2561 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488291 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-sys\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488235 2561 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-wtmp\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488324 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488323 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3e122c2-a77f-466b-b593-7d6143d118cd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:55:01.488352 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-tls podName:22ae2e7e-371a-42e0-a0a9-0a3dc249db95 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:01.98833251 +0000 UTC m=+63.471481408 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-tls") pod "node-exporter-spv5g" (UID: "22ae2e7e-371a-42e0-a0a9-0a3dc249db95") : secret "node-exporter-tls" not found Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488362 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-sys\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488389 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488417 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-root\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488462 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-root\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488493 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-metrics-client-ca\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.488736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.488616 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-accelerators-collector-config\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.490457 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.490441 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.497073 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.497052 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhc7w\" (UniqueName: \"kubernetes.io/projected/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-kube-api-access-hhc7w\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.589357 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.589292 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3e122c2-a77f-466b-b593-7d6143d118cd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" 
Apr 16 19:55:01.589357 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.589340 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2dk\" (UniqueName: \"kubernetes.io/projected/b3e122c2-a77f-466b-b593-7d6143d118cd-kube-api-access-6f2dk\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.589548 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.589393 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3e122c2-a77f-466b-b593-7d6143d118cd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.589548 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.589448 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3e122c2-a77f-466b-b593-7d6143d118cd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.590067 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.590046 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3e122c2-a77f-466b-b593-7d6143d118cd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.591669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.591649 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b3e122c2-a77f-466b-b593-7d6143d118cd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.591774 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.591754 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b3e122c2-a77f-466b-b593-7d6143d118cd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.597522 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.597505 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2dk\" (UniqueName: \"kubernetes.io/projected/b3e122c2-a77f-466b-b593-7d6143d118cd-kube-api-access-6f2dk\") pod \"openshift-state-metrics-9d44df66c-5549q\" (UID: \"b3e122c2-a77f-466b-b593-7d6143d118cd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.691230 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.691201 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" Apr 16 19:55:01.814954 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.814927 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5549q"] Apr 16 19:55:01.818196 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:01.818170 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3e122c2_a77f_466b_b593_7d6143d118cd.slice/crio-e003ea09b4fa71bb1e66fd43e753791321482ad355c1d481f94ec9bedd0c2964 WatchSource:0}: Error finding container e003ea09b4fa71bb1e66fd43e753791321482ad355c1d481f94ec9bedd0c2964: Status 404 returned error can't find the container with id e003ea09b4fa71bb1e66fd43e753791321482ad355c1d481f94ec9bedd0c2964 Apr 16 19:55:01.992681 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.992655 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-tls\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:01.994786 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:01.994759 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/22ae2e7e-371a-42e0-a0a9-0a3dc249db95-node-exporter-tls\") pod \"node-exporter-spv5g\" (UID: \"22ae2e7e-371a-42e0-a0a9-0a3dc249db95\") " pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:02.014822 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.014800 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9689c5b7d-7k6p7"] Apr 16 19:55:02.025938 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.025915 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.026043 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.025983 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9689c5b7d-7k6p7"] Apr 16 19:55:02.028619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.028044 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:55:02.028619 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.028496 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 19:55:02.028766 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.028645 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 19:55:02.028766 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.028720 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7f55r\"" Apr 16 19:55:02.028923 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.028881 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:55:02.028923 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.028881 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:55:02.029142 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.029124 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:55:02.029231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.029215 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:55:02.093385 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:55:02.093353 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-oauth-config\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.093385 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.093387 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-serving-cert\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.093591 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.093422 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-console-config\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.093591 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.093500 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72z4\" (UniqueName: \"kubernetes.io/projected/4775e95e-4763-42b5-ad41-20316a962a92-kube-api-access-f72z4\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.093591 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.093551 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-service-ca\") pod \"console-9689c5b7d-7k6p7\" (UID: 
\"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.093706 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.093606 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-oauth-serving-cert\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194159 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194122 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-oauth-serving-cert\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194345 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194169 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-oauth-config\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194345 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194185 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-serving-cert\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194345 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194322 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-console-config\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194502 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194363 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f72z4\" (UniqueName: \"kubernetes.io/projected/4775e95e-4763-42b5-ad41-20316a962a92-kube-api-access-f72z4\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194502 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194418 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-service-ca\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194982 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194961 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-service-ca\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.194982 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.194972 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-console-config\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.196554 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.196538 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-serving-cert\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.196605 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.196582 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-oauth-config\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.202268 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.202251 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72z4\" (UniqueName: \"kubernetes.io/projected/4775e95e-4763-42b5-ad41-20316a962a92-kube-api-access-f72z4\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.209657 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.209632 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-oauth-serving-cert\") pod \"console-9689c5b7d-7k6p7\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") " pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.274669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.274636 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-spv5g" Apr 16 19:55:02.282028 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:02.282005 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ae2e7e_371a_42e0_a0a9_0a3dc249db95.slice/crio-290e5ccbb49e53e2c39579a7606c1a53c5d2541e0a31d3799910eddb290e01cd WatchSource:0}: Error finding container 290e5ccbb49e53e2c39579a7606c1a53c5d2541e0a31d3799910eddb290e01cd: Status 404 returned error can't find the container with id 290e5ccbb49e53e2c39579a7606c1a53c5d2541e0a31d3799910eddb290e01cd Apr 16 19:55:02.335594 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.335567 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:02.406019 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.405906 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" event={"ID":"b3e122c2-a77f-466b-b593-7d6143d118cd","Type":"ContainerStarted","Data":"dae4d1f70c35ca937d8ce2f67463b9b9bdb045737be42260a840832bc6619574"} Apr 16 19:55:02.406019 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.405945 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" event={"ID":"b3e122c2-a77f-466b-b593-7d6143d118cd","Type":"ContainerStarted","Data":"a354fbbb12495a0a7f2d277ed217dcaff76ca2e1893385520b3e6ba6302b58fc"} Apr 16 19:55:02.406019 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.405957 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" event={"ID":"b3e122c2-a77f-466b-b593-7d6143d118cd","Type":"ContainerStarted","Data":"e003ea09b4fa71bb1e66fd43e753791321482ad355c1d481f94ec9bedd0c2964"} Apr 16 19:55:02.407782 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.407753 2561 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-spv5g" event={"ID":"22ae2e7e-371a-42e0-a0a9-0a3dc249db95","Type":"ContainerStarted","Data":"290e5ccbb49e53e2c39579a7606c1a53c5d2541e0a31d3799910eddb290e01cd"} Apr 16 19:55:02.451289 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:02.451078 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9689c5b7d-7k6p7"] Apr 16 19:55:02.455445 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:02.455413 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4775e95e_4763_42b5_ad41_20316a962a92.slice/crio-ad1b790d6586d2f6f388770a5b4e519190a1ecc84106226cfe5b2f9dfb4572fd WatchSource:0}: Error finding container ad1b790d6586d2f6f388770a5b4e519190a1ecc84106226cfe5b2f9dfb4572fd: Status 404 returned error can't find the container with id ad1b790d6586d2f6f388770a5b4e519190a1ecc84106226cfe5b2f9dfb4572fd Apr 16 19:55:03.412951 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.412918 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" event={"ID":"b3e122c2-a77f-466b-b593-7d6143d118cd","Type":"ContainerStarted","Data":"1f033014fba526d13c6505d977d57da4a001664db5b94c23e7ba9caef7ae83e8"} Apr 16 19:55:03.414224 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.414196 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9689c5b7d-7k6p7" event={"ID":"4775e95e-4763-42b5-ad41-20316a962a92","Type":"ContainerStarted","Data":"ad1b790d6586d2f6f388770a5b4e519190a1ecc84106226cfe5b2f9dfb4572fd"} Apr 16 19:55:03.452342 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.452294 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5549q" podStartSLOduration=1.190011467 podStartE2EDuration="2.452276249s" 
podCreationTimestamp="2026-04-16 19:55:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:02.00893518 +0000 UTC m=+63.492084076" lastFinishedPulling="2026-04-16 19:55:03.271199942 +0000 UTC m=+64.754348858" observedRunningTime="2026-04-16 19:55:03.451256309 +0000 UTC m=+64.934405230" watchObservedRunningTime="2026-04-16 19:55:03.452276249 +0000 UTC m=+64.935425169" Apr 16 19:55:03.457535 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.457504 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-bb845d988-9wk9x"] Apr 16 19:55:03.485932 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.485905 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-bb845d988-9wk9x"] Apr 16 19:55:03.486087 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.486044 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.489223 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.489198 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 19:55:03.489476 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.489458 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 19:55:03.489846 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.489815 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 19:55:03.489942 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.489922 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 19:55:03.490319 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.490299 2561 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-a863mm8fmgt9u\"" Apr 16 19:55:03.490524 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.490502 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 19:55:03.490872 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.490852 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-sm9z4\"" Apr 16 19:55:03.504960 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.504900 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-grpc-tls\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.505104 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505038 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.505104 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505076 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 
19:55:03.505231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505108 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-tls\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.505231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505139 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.505231 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505170 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.505384 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505243 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-metrics-client-ca\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.505384 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.505349 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c74mr\" (UniqueName: \"kubernetes.io/projected/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-kube-api-access-c74mr\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606376 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606290 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606376 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606346 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-tls\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606604 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606379 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606604 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606407 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606722 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606600 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-metrics-client-ca\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606722 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606682 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c74mr\" (UniqueName: \"kubernetes.io/projected/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-kube-api-access-c74mr\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606722 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606711 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-grpc-tls\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.606906 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.606791 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " 
pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.609736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.609688 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-tls\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.609736 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.609693 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.610497 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.610412 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.610728 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.610689 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.611930 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.611906 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-grpc-tls\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.616425 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.616403 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c74mr\" (UniqueName: \"kubernetes.io/projected/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-kube-api-access-c74mr\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.618026 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.617985 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-metrics-client-ca\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.624810 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.624785 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/3f3e2227-a0ef-40fe-ba36-6816e80bce5c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-bb845d988-9wk9x\" (UID: \"3f3e2227-a0ef-40fe-ba36-6816e80bce5c\") " pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.797214 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.797190 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:03.970132 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:03.970104 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-bb845d988-9wk9x"] Apr 16 19:55:03.987904 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:03.987864 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f3e2227_a0ef_40fe_ba36_6816e80bce5c.slice/crio-43c2cfea622eb93702e81a233f201d11413839432090abb01b1b0dcd0b5f4123 WatchSource:0}: Error finding container 43c2cfea622eb93702e81a233f201d11413839432090abb01b1b0dcd0b5f4123: Status 404 returned error can't find the container with id 43c2cfea622eb93702e81a233f201d11413839432090abb01b1b0dcd0b5f4123 Apr 16 19:55:04.418794 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.418754 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"43c2cfea622eb93702e81a233f201d11413839432090abb01b1b0dcd0b5f4123"} Apr 16 19:55:04.420459 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.420431 2561 generic.go:358] "Generic (PLEG): container finished" podID="22ae2e7e-371a-42e0-a0a9-0a3dc249db95" containerID="a6599923ae9822e33dc1ddb1f0dec1ca00b500856a21fabe8e79e4ffd80586f9" exitCode=0 Apr 16 19:55:04.420599 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.420505 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-spv5g" event={"ID":"22ae2e7e-371a-42e0-a0a9-0a3dc249db95","Type":"ContainerDied","Data":"a6599923ae9822e33dc1ddb1f0dec1ca00b500856a21fabe8e79e4ffd80586f9"} Apr 16 19:55:04.818270 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.818177 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:55:04.820932 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.820906 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:55:04.831581 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.831555 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a0ce8326-ca5a-49b9-90c0-94db13c2c74e-metrics-certs\") pod \"network-metrics-daemon-hkqtk\" (UID: \"a0ce8326-ca5a-49b9-90c0-94db13c2c74e\") " pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:55:04.918703 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.918666 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:55:04.918910 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.918745 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:55:04.921266 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.921242 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:55:04.921380 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.921243 2561 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:55:04.931877 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.931851 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/65540f57-d96d-4dcd-b7a1-010ff5e40cae-original-pull-secret\") pod \"global-pull-secret-syncer-dghfv\" (UID: \"65540f57-d96d-4dcd-b7a1-010ff5e40cae\") " pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:55:04.932197 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.932169 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:55:04.942510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.942475 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/3e96a428-4bd2-4a4f-b624-974f68f14131-kube-api-access-lw4wh\") pod \"network-check-target-jkgr6\" (UID: \"3e96a428-4bd2-4a4f-b624-974f68f14131\") " pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:55:04.962295 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.962266 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kmsjn\"" Apr 16 19:55:04.967470 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.967445 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t9ktw\"" Apr 16 19:55:04.971083 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.971026 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:55:04.976744 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.976721 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hkqtk" Apr 16 19:55:04.979605 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:04.979581 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dghfv" Apr 16 19:55:05.391954 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:05.391919 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pzd2h" Apr 16 19:55:06.267346 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.265706 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jkgr6"] Apr 16 19:55:06.272460 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:06.272428 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e96a428_4bd2_4a4f_b624_974f68f14131.slice/crio-8501686cd8ddcb7745ba7511d91b64692978c98032e5d916ca1fc7f6a8765386 WatchSource:0}: Error finding container 8501686cd8ddcb7745ba7511d91b64692978c98032e5d916ca1fc7f6a8765386: Status 404 returned error can't find the container with id 8501686cd8ddcb7745ba7511d91b64692978c98032e5d916ca1fc7f6a8765386 Apr 16 19:55:06.288897 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.288844 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hkqtk"] Apr 16 19:55:06.293088 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:06.293058 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ce8326_ca5a_49b9_90c0_94db13c2c74e.slice/crio-519da467a655b6cd0f8fe8156087f97fa5ad624836398c9ee110a73c79b49e16 WatchSource:0}: Error finding container 519da467a655b6cd0f8fe8156087f97fa5ad624836398c9ee110a73c79b49e16: Status 404 returned error can't find the container with id 519da467a655b6cd0f8fe8156087f97fa5ad624836398c9ee110a73c79b49e16 Apr 16 
19:55:06.427317 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.427275 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9689c5b7d-7k6p7" event={"ID":"4775e95e-4763-42b5-ad41-20316a962a92","Type":"ContainerStarted","Data":"694c53c9886ddaa8e39136d8e077359a042c5e4e4871f61298a987956a614e1a"} Apr 16 19:55:06.429060 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.429036 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"006dbf42559c8ac0e81adb3cc8d823828a2ff860a2079854164e896f18459787"} Apr 16 19:55:06.429181 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.429065 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"add4fc8ad85e479a7d6b56d15c4f21973bd3921845607aeb640163d05479ae09"} Apr 16 19:55:06.429181 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.429078 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"206b84ca6a05adfa832e7666582770ded7dfdb9b834584462e42b8b28a1abb5a"} Apr 16 19:55:06.430895 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.430871 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-spv5g" event={"ID":"22ae2e7e-371a-42e0-a0a9-0a3dc249db95","Type":"ContainerStarted","Data":"8af4bf522e9c8a1f09a86b582ab490bb3f25944e0f88b6ae45c18f4367bef718"} Apr 16 19:55:06.430966 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.430903 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-spv5g" 
event={"ID":"22ae2e7e-371a-42e0-a0a9-0a3dc249db95","Type":"ContainerStarted","Data":"6a2e4be0c8dcb56c93e48d48995b271a6246fa0e80ba221b6209f9a917879fb2"} Apr 16 19:55:06.431876 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.431852 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hkqtk" event={"ID":"a0ce8326-ca5a-49b9-90c0-94db13c2c74e","Type":"ContainerStarted","Data":"519da467a655b6cd0f8fe8156087f97fa5ad624836398c9ee110a73c79b49e16"} Apr 16 19:55:06.432755 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.432728 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jkgr6" event={"ID":"3e96a428-4bd2-4a4f-b624-974f68f14131","Type":"ContainerStarted","Data":"8501686cd8ddcb7745ba7511d91b64692978c98032e5d916ca1fc7f6a8765386"} Apr 16 19:55:06.445402 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.445364 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9689c5b7d-7k6p7" podStartSLOduration=1.792091221 podStartE2EDuration="5.445353881s" podCreationTimestamp="2026-04-16 19:55:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:02.457772275 +0000 UTC m=+63.940921171" lastFinishedPulling="2026-04-16 19:55:06.111034919 +0000 UTC m=+67.594183831" observedRunningTime="2026-04-16 19:55:06.44462239 +0000 UTC m=+67.927771308" watchObservedRunningTime="2026-04-16 19:55:06.445353881 +0000 UTC m=+67.928502800" Apr 16 19:55:06.467177 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.467131 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-spv5g" podStartSLOduration=3.963133056 podStartE2EDuration="5.467117768s" podCreationTimestamp="2026-04-16 19:55:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:02.283466694 +0000 UTC m=+63.766615593" lastFinishedPulling="2026-04-16 19:55:03.787451403 +0000 UTC m=+65.270600305" observedRunningTime="2026-04-16 
19:55:06.465648717 +0000 UTC m=+67.948797635" watchObservedRunningTime="2026-04-16 19:55:06.467117768 +0000 UTC m=+67.950266692" Apr 16 19:55:06.487181 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:06.487153 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dghfv"] Apr 16 19:55:06.489999 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:06.489975 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65540f57_d96d_4dcd_b7a1_010ff5e40cae.slice/crio-5837f3bde9265f5c15422f1e6376bd2fa21fddd564de5861ac5741fa5e668b17 WatchSource:0}: Error finding container 5837f3bde9265f5c15422f1e6376bd2fa21fddd564de5861ac5741fa5e668b17: Status 404 returned error can't find the container with id 5837f3bde9265f5c15422f1e6376bd2fa21fddd564de5861ac5741fa5e668b17 Apr 16 19:55:07.437039 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.436993 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dghfv" event={"ID":"65540f57-d96d-4dcd-b7a1-010ff5e40cae","Type":"ContainerStarted","Data":"5837f3bde9265f5c15422f1e6376bd2fa21fddd564de5861ac5741fa5e668b17"} Apr 16 19:55:07.586437 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.585216 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:07.590513 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.590485 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.593122 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.593057 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 19:55:07.596069 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.595889 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.599345 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jkjtr\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.599592 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.599822 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.599956 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.600059 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.600210 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.600411 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" 
Apr 16 19:55:07.600669 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.600416 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 19:55:07.602962 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.601289 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 19:55:07.602962 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.601767 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 19:55:07.602962 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.602787 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 19:55:07.603213 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.603091 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3nrmrj703e4ke\"" Apr 16 19:55:07.607444 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.606875 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641679 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641734 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641781 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641811 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641851 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641882 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:55:07.641923 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm765\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-kube-api-access-qm765\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642025 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.641982 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642035 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642099 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642130 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642166 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642212 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642260 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642285 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642310 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642337 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.642510 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.642368 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.653120 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.653076 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:07.742857 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.742791 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743012 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743071 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743100 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743123 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743157 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743195 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qm765\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-kube-api-access-qm765\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743232 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743258 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743290 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743319 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743346 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743377 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.743424 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743429 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.744157 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743459 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.744157 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743482 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.744157 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:55:07.743506 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.744157 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.743539 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.746143 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.745211 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.746143 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.745841 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.747330 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.747305 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.749189 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.747451 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.749189 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.748109 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.749189 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.749123 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.751141 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.751180 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config\") pod 
\"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.751352 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.751638 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.751710 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.751942 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.752106 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.752428 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.752305 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.753568 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.753393 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.753931 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.753868 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-web-config\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.754681 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.754635 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config-out\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.756192 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.756168 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm765\" (UniqueName: 
\"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-kube-api-access-qm765\") pod \"prometheus-k8s-0\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:07.906776 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:07.906738 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:55:08.087409 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.087195 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:55:08.090366 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:55:08.090337 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3aa8ce4_8886_41a4_9744_0fb16fa33fc2.slice/crio-2f564bb3184ab46c198e7a70c4ae900303dd9dea301a500cd5d9881924a1cae9 WatchSource:0}: Error finding container 2f564bb3184ab46c198e7a70c4ae900303dd9dea301a500cd5d9881924a1cae9: Status 404 returned error can't find the container with id 2f564bb3184ab46c198e7a70c4ae900303dd9dea301a500cd5d9881924a1cae9 Apr 16 19:55:08.446951 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.446755 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hkqtk" event={"ID":"a0ce8326-ca5a-49b9-90c0-94db13c2c74e","Type":"ContainerStarted","Data":"a99d5096ffb852023a1039d4c4c0bae540339d184178f6defbca8a99bf9791f4"} Apr 16 19:55:08.446951 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.446856 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hkqtk" event={"ID":"a0ce8326-ca5a-49b9-90c0-94db13c2c74e","Type":"ContainerStarted","Data":"eb7941661aa9454a3b0571eb5a4a538d03aea2775d6099022db07dd25bcc30e2"} Apr 16 19:55:08.448972 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.448940 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"2f564bb3184ab46c198e7a70c4ae900303dd9dea301a500cd5d9881924a1cae9"} Apr 16 19:55:08.452221 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.452187 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"4654c54b7fb146fd80dc0636a8caf928afae3d00ef51a4440982befe0291e2c4"} Apr 16 19:55:08.452221 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.452222 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"ec97383fd2f34161837848c971a6126ddb7491826963de12eef0408eb8748762"} Apr 16 19:55:08.452408 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.452237 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" event={"ID":"3f3e2227-a0ef-40fe-ba36-6816e80bce5c","Type":"ContainerStarted","Data":"c8fd1ab74e9b6342c059b80776d43e3b9600c3e03a3fea0aaa4bd8b75ee56f15"} Apr 16 19:55:08.452627 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.452609 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:08.468096 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.468046 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hkqtk" podStartSLOduration=68.306703233 podStartE2EDuration="1m9.468030359s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:55:06.295123911 +0000 UTC m=+67.778272808" lastFinishedPulling="2026-04-16 19:55:07.456451031 +0000 UTC m=+68.939599934" observedRunningTime="2026-04-16 19:55:08.466191848 +0000 
UTC m=+69.949340768" watchObservedRunningTime="2026-04-16 19:55:08.468030359 +0000 UTC m=+69.951179280" Apr 16 19:55:08.509690 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:08.509254 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" podStartSLOduration=2.088349654 podStartE2EDuration="5.509233114s" podCreationTimestamp="2026-04-16 19:55:03 +0000 UTC" firstStartedPulling="2026-04-16 19:55:03.990309813 +0000 UTC m=+65.473458714" lastFinishedPulling="2026-04-16 19:55:07.411193262 +0000 UTC m=+68.894342174" observedRunningTime="2026-04-16 19:55:08.50671337 +0000 UTC m=+69.989862291" watchObservedRunningTime="2026-04-16 19:55:08.509233114 +0000 UTC m=+69.992382034" Apr 16 19:55:11.098816 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.098773 2561 patch_prober.go:28] interesting pod/image-registry-68b985c7d7-k2vd9 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 19:55:11.099224 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.098854 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" podUID="f8f97a0d-d529-41d9-b1dd-4aa61965968d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:55:11.463462 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.463433 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455" exitCode=0 Apr 16 19:55:11.463662 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.463512 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} Apr 16 19:55:11.464995 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.464959 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jkgr6" event={"ID":"3e96a428-4bd2-4a4f-b624-974f68f14131","Type":"ContainerStarted","Data":"2de4e2d238ed2b4f3740f622f95980b4c42295c09104bceed2fe823e363ad69a"} Apr 16 19:55:11.465175 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.465144 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jkgr6" Apr 16 19:55:11.466414 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.466382 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dghfv" event={"ID":"65540f57-d96d-4dcd-b7a1-010ff5e40cae","Type":"ContainerStarted","Data":"92fdd4468fba36645a0363748c99cd2aa830b89386db063b048b08895a497a0b"} Apr 16 19:55:11.507328 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:11.507281 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jkgr6" podStartSLOduration=67.575027591 podStartE2EDuration="1m12.507261807s" podCreationTimestamp="2026-04-16 19:53:59 +0000 UTC" firstStartedPulling="2026-04-16 19:55:06.27487783 +0000 UTC m=+67.758026727" lastFinishedPulling="2026-04-16 19:55:11.207112031 +0000 UTC m=+72.690260943" observedRunningTime="2026-04-16 19:55:11.506726191 +0000 UTC m=+72.989875111" watchObservedRunningTime="2026-04-16 19:55:11.507261807 +0000 UTC m=+72.990410728" Apr 16 19:55:11.579255 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:55:11.578919 2561 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found Apr 16 19:55:11.579255 ip-10-0-130-164 
kubenswrapper[2561]: E0416 19:55:11.579022 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0 podName:b3aa8ce4-8886-41a4-9744-0fb16fa33fc2 nodeName:}" failed. No retries permitted until 2026-04-16 19:55:12.078998328 +0000 UTC m=+73.562147246 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2") : configmap "prometheus-k8s-rulefiles-0" not found Apr 16 19:55:12.336413 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:12.336379 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:12.336877 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:12.336538 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:12.341913 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:12.341888 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:12.362076 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:12.362002 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dghfv" podStartSLOduration=66.649725445 podStartE2EDuration="1m11.361986122s" podCreationTimestamp="2026-04-16 19:54:01 +0000 UTC" firstStartedPulling="2026-04-16 19:55:06.491646305 +0000 UTC m=+67.974795205" lastFinishedPulling="2026-04-16 19:55:11.203906985 +0000 UTC m=+72.687055882" observedRunningTime="2026-04-16 19:55:11.525349687 +0000 UTC m=+73.008498606" watchObservedRunningTime="2026-04-16 19:55:12.361986122 +0000 UTC m=+73.845135044" Apr 16 19:55:12.474628 ip-10-0-130-164 kubenswrapper[2561]: I0416 
19:55:12.474556 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9689c5b7d-7k6p7" Apr 16 19:55:14.386422 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.386354 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-68b985c7d7-k2vd9" Apr 16 19:55:14.462596 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.462573 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-bb845d988-9wk9x" Apr 16 19:55:14.479798 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.479767 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} Apr 16 19:55:14.479798 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.479801 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} Apr 16 19:55:14.479992 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.479810 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} Apr 16 19:55:14.479992 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.479819 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} Apr 16 19:55:14.479992 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.479849 2561 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"}
Apr 16 19:55:14.479992 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.479860 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerStarted","Data":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"}
Apr 16 19:55:14.548573 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:14.548521 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.775775053 podStartE2EDuration="7.54850669s" podCreationTimestamp="2026-04-16 19:55:07 +0000 UTC" firstStartedPulling="2026-04-16 19:55:08.092624141 +0000 UTC m=+69.575773038" lastFinishedPulling="2026-04-16 19:55:13.865355777 +0000 UTC m=+75.348504675" observedRunningTime="2026-04-16 19:55:14.546383123 +0000 UTC m=+76.029532041" watchObservedRunningTime="2026-04-16 19:55:14.54850669 +0000 UTC m=+76.031655608"
Apr 16 19:55:17.907155 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:17.907033 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:55:21.329175 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:21.329135 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9689c5b7d-7k6p7"]
Apr 16 19:55:37.044443 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:37.044400 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-htqsk_3d9dabfc-3f32-488d-81d1-0da63416b50c/serve-healthcheck-canary/0.log"
Apr 16 19:55:42.472691 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:42.472661 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jkgr6"
Apr 16 19:55:46.348272 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.348200 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9689c5b7d-7k6p7" podUID="4775e95e-4763-42b5-ad41-20316a962a92" containerName="console" containerID="cri-o://694c53c9886ddaa8e39136d8e077359a042c5e4e4871f61298a987956a614e1a" gracePeriod=15
Apr 16 19:55:46.569431 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.569409 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9689c5b7d-7k6p7_4775e95e-4763-42b5-ad41-20316a962a92/console/0.log"
Apr 16 19:55:46.569553 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.569444 2561 generic.go:358] "Generic (PLEG): container finished" podID="4775e95e-4763-42b5-ad41-20316a962a92" containerID="694c53c9886ddaa8e39136d8e077359a042c5e4e4871f61298a987956a614e1a" exitCode=2
Apr 16 19:55:46.569553 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.569476 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9689c5b7d-7k6p7" event={"ID":"4775e95e-4763-42b5-ad41-20316a962a92","Type":"ContainerDied","Data":"694c53c9886ddaa8e39136d8e077359a042c5e4e4871f61298a987956a614e1a"}
Apr 16 19:55:46.581238 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.581216 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9689c5b7d-7k6p7_4775e95e-4763-42b5-ad41-20316a962a92/console/0.log"
Apr 16 19:55:46.581335 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.581284 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9689c5b7d-7k6p7"
Apr 16 19:55:46.677891 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.677858 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-service-ca\") pod \"4775e95e-4763-42b5-ad41-20316a962a92\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") "
Apr 16 19:55:46.678060 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.677909 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-serving-cert\") pod \"4775e95e-4763-42b5-ad41-20316a962a92\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") "
Apr 16 19:55:46.678060 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.677942 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72z4\" (UniqueName: \"kubernetes.io/projected/4775e95e-4763-42b5-ad41-20316a962a92-kube-api-access-f72z4\") pod \"4775e95e-4763-42b5-ad41-20316a962a92\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") "
Apr 16 19:55:46.678060 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.677986 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-oauth-serving-cert\") pod \"4775e95e-4763-42b5-ad41-20316a962a92\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") "
Apr 16 19:55:46.678272 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678161 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-console-config\") pod \"4775e95e-4763-42b5-ad41-20316a962a92\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") "
Apr 16 19:55:46.678272 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678208 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-oauth-config\") pod \"4775e95e-4763-42b5-ad41-20316a962a92\" (UID: \"4775e95e-4763-42b5-ad41-20316a962a92\") "
Apr 16 19:55:46.678391 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678331 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-service-ca" (OuterVolumeSpecName: "service-ca") pod "4775e95e-4763-42b5-ad41-20316a962a92" (UID: "4775e95e-4763-42b5-ad41-20316a962a92"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:46.678444 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678396 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4775e95e-4763-42b5-ad41-20316a962a92" (UID: "4775e95e-4763-42b5-ad41-20316a962a92"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:46.678495 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678460 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-console-config" (OuterVolumeSpecName: "console-config") pod "4775e95e-4763-42b5-ad41-20316a962a92" (UID: "4775e95e-4763-42b5-ad41-20316a962a92"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:55:46.678495 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678479 2561 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-service-ca\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:55:46.678568 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.678498 2561 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-oauth-serving-cert\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:55:46.680309 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.680289 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4775e95e-4763-42b5-ad41-20316a962a92" (UID: "4775e95e-4763-42b5-ad41-20316a962a92"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:46.680403 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.680323 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4775e95e-4763-42b5-ad41-20316a962a92-kube-api-access-f72z4" (OuterVolumeSpecName: "kube-api-access-f72z4") pod "4775e95e-4763-42b5-ad41-20316a962a92" (UID: "4775e95e-4763-42b5-ad41-20316a962a92"). InnerVolumeSpecName "kube-api-access-f72z4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:55:46.680403 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.680358 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4775e95e-4763-42b5-ad41-20316a962a92" (UID: "4775e95e-4763-42b5-ad41-20316a962a92"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:55:46.779407 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.779369 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f72z4\" (UniqueName: \"kubernetes.io/projected/4775e95e-4763-42b5-ad41-20316a962a92-kube-api-access-f72z4\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:55:46.779407 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.779400 2561 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4775e95e-4763-42b5-ad41-20316a962a92-console-config\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:55:46.779407 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.779410 2561 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-oauth-config\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:55:46.779616 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:46.779419 2561 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4775e95e-4763-42b5-ad41-20316a962a92-console-serving-cert\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:55:47.573628 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:47.573550 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9689c5b7d-7k6p7_4775e95e-4763-42b5-ad41-20316a962a92/console/0.log"
Apr 16 19:55:47.574067 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:47.573671 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9689c5b7d-7k6p7"
Apr 16 19:55:47.574067 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:47.573668 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9689c5b7d-7k6p7" event={"ID":"4775e95e-4763-42b5-ad41-20316a962a92","Type":"ContainerDied","Data":"ad1b790d6586d2f6f388770a5b4e519190a1ecc84106226cfe5b2f9dfb4572fd"}
Apr 16 19:55:47.574067 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:47.573794 2561 scope.go:117] "RemoveContainer" containerID="694c53c9886ddaa8e39136d8e077359a042c5e4e4871f61298a987956a614e1a"
Apr 16 19:55:47.593464 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:47.593430 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9689c5b7d-7k6p7"]
Apr 16 19:55:47.598456 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:47.598425 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9689c5b7d-7k6p7"]
Apr 16 19:55:49.158946 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:55:49.158897 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4775e95e-4763-42b5-ad41-20316a962a92" path="/var/lib/kubelet/pods/4775e95e-4763-42b5-ad41-20316a962a92/volumes"
Apr 16 19:56:07.907889 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:07.907847 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:07.928388 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:07.928354 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:08.649007 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:08.648983 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:25.910132 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910096 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:25.910764 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910698 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy" containerID="cri-o://2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" gracePeriod=600
Apr 16 19:56:25.910932 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910746 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-web" containerID="cri-o://f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" gracePeriod=600
Apr 16 19:56:25.910932 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910739 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="thanos-sidecar" containerID="cri-o://5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" gracePeriod=600
Apr 16 19:56:25.911024 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910760 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="config-reloader" containerID="cri-o://652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" gracePeriod=600
Apr 16 19:56:25.911024 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910756 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-thanos" containerID="cri-o://75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" gracePeriod=600
Apr 16 19:56:25.911024 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:25.910682 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="prometheus" containerID="cri-o://c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" gracePeriod=600
Apr 16 19:56:26.141133 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.141108 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.279024 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.278935 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-web-config\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279024 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.278974 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-tls-assets\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279024 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279019 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279046 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-db\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279065 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-grpc-tls\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279082 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279100 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-kubelet-serving-ca-bundle\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279123 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-tls\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279151 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-kube-rbac-proxy\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279175 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config-out\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279201 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-serving-certs-ca-bundle\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279268 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-metrics-client-ca\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279298 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279293 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279793 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279323 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm765\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-kube-api-access-qm765\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279793 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279372 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279793 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279400 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-thanos-prometheus-http-client-file\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279793 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279427 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-trusted-ca-bundle\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.279793 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.279457 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-metrics-client-certs\") pod \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\" (UID: \"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2\") "
Apr 16 19:56:26.280242 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.280171 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:56:26.281928 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.281887 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:26.282052 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.282000 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:26.282472 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.282444 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:26.282803 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.282756 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:26.283131 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283075 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.283604 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283327 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:56:26.283604 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283349 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:56:26.283604 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283463 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.283810 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283756 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.283810 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283764 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config-out" (OuterVolumeSpecName: "config-out") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:56:26.283957 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.283882 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.284175 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.284037 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.284175 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.284137 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-kube-api-access-qm765" (OuterVolumeSpecName: "kube-api-access-qm765") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "kube-api-access-qm765". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:56:26.284431 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.284409 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.284680 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.284662 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config" (OuterVolumeSpecName: "config") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.285368 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.285344 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.292191 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.292167 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-web-config" (OuterVolumeSpecName: "web-config") pod "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" (UID: "b3aa8ce4-8886-41a4-9744-0fb16fa33fc2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:56:26.380903 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380868 2561 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.380903 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380899 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qm765\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-kube-api-access-qm765\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380913 2561 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380928 2561 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380943 2561 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380955 2561 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-metrics-client-certs\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380966 2561 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-web-config\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380978 2561 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-tls-assets\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.380991 2561 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381003 2561 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-prometheus-k8s-db\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381014 2561 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-grpc-tls\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381026 2561 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381039 2561 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381053 2561 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381067 2561 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-secret-kube-rbac-proxy\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381081 2561 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-config-out\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381094 2561 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.381147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.381109 2561 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2-configmap-metrics-client-ca\") on node \"ip-10-0-130-164.ec2.internal\" DevicePath \"\""
Apr 16 19:56:26.685638 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685610 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" exitCode=0
Apr 16 19:56:26.685638 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685634 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" exitCode=0
Apr 16 19:56:26.685638 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685644 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" exitCode=0
Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685652 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" exitCode=0
Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685660 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" exitCode=0
Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685667 2561 generic.go:358] "Generic (PLEG): container finished" podID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" exitCode=0
Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685697 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"}
Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685741 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"}
Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685755 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"}
Apr 16 19:56:26.685765 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"}
Apr 16 19:56:26.685881 ip-10-0-130-164
kubenswrapper[2561]: I0416 19:56:26.685774 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685783 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685710 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685795 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b3aa8ce4-8886-41a4-9744-0fb16fa33fc2","Type":"ContainerDied","Data":"2f564bb3184ab46c198e7a70c4ae900303dd9dea301a500cd5d9881924a1cae9"} Apr 16 19:56:26.685881 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.685815 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" Apr 16 19:56:26.692919 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.692901 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" Apr 16 19:56:26.699177 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.699160 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" Apr 16 19:56:26.705515 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.705493 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" Apr 16 19:56:26.707188 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:56:26.707165 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:26.712006 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.711990 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" Apr 16 19:56:26.713480 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.713462 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:26.718435 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.718420 2561 scope.go:117] "RemoveContainer" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" Apr 16 19:56:26.724630 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.724608 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455" Apr 16 19:56:26.730538 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.730523 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" Apr 16 19:56:26.730797 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.730780 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" Apr 16 19:56:26.730894 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.730811 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} err="failed to get container status \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": rpc error: code = NotFound desc = could 
not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist" Apr 16 19:56:26.730894 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.730877 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" Apr 16 19:56:26.731150 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.731133 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" Apr 16 19:56:26.731192 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731169 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} err="failed to get container status \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist" Apr 16 19:56:26.731192 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731186 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" Apr 16 19:56:26.731410 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.731394 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container 
with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" Apr 16 19:56:26.731454 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731415 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} err="failed to get container status \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": rpc error: code = NotFound desc = could not find container \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist" Apr 16 19:56:26.731454 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731432 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" Apr 16 19:56:26.731651 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.731635 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" Apr 16 19:56:26.731712 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731657 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} err="failed to get container status \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID 
starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist" Apr 16 19:56:26.731712 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731680 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" Apr 16 19:56:26.731916 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.731894 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" Apr 16 19:56:26.731962 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731921 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} err="failed to get container status \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist" Apr 16 19:56:26.731962 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.731940 2561 scope.go:117] "RemoveContainer" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" Apr 16 19:56:26.732161 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.732141 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not 
exist" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" Apr 16 19:56:26.732198 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732164 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} err="failed to get container status \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not exist" Apr 16 19:56:26.732198 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732177 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455" Apr 16 19:56:26.732394 ip-10-0-130-164 kubenswrapper[2561]: E0416 19:56:26.732378 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455" Apr 16 19:56:26.732460 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732402 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} err="failed to get container status \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist" Apr 16 
19:56:26.732460 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732421 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" Apr 16 19:56:26.732628 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732613 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} err="failed to get container status \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": rpc error: code = NotFound desc = could not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist" Apr 16 19:56:26.732684 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732629 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" Apr 16 19:56:26.732866 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732847 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} err="failed to get container status \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist" Apr 16 19:56:26.732914 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.732868 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" Apr 16 19:56:26.733071 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733055 2561 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} err="failed to get container status \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": rpc error: code = NotFound desc = could not find container \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist" Apr 16 19:56:26.733112 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733071 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" Apr 16 19:56:26.733302 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733282 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} err="failed to get container status \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist" Apr 16 19:56:26.733365 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733303 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" Apr 16 19:56:26.733527 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733512 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} err="failed to get container status \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 
652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist" Apr 16 19:56:26.733572 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733527 2561 scope.go:117] "RemoveContainer" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" Apr 16 19:56:26.733726 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733707 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} err="failed to get container status \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not exist" Apr 16 19:56:26.733792 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733727 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455" Apr 16 19:56:26.733962 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733944 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} err="failed to get container status \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist" Apr 16 19:56:26.734012 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.733963 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" Apr 16 19:56:26.734147 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734130 2561 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} err="failed to get container status \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": rpc error: code = NotFound desc = could not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist" Apr 16 19:56:26.734213 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734149 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" Apr 16 19:56:26.734377 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734358 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} err="failed to get container status \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist" Apr 16 19:56:26.734421 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734379 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" Apr 16 19:56:26.734568 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734551 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} err="failed to get container status \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": rpc error: code = NotFound desc = could not find container 
\"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist" Apr 16 19:56:26.734633 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734569 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6" Apr 16 19:56:26.734755 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734738 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} err="failed to get container status \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist" Apr 16 19:56:26.734799 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734755 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d" Apr 16 19:56:26.734943 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734921 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} err="failed to get container status \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist" Apr 16 19:56:26.735016 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.734943 2561 scope.go:117] "RemoveContainer" 
containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003" Apr 16 19:56:26.735184 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735167 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} err="failed to get container status \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not exist" Apr 16 19:56:26.735250 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735186 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455" Apr 16 19:56:26.735442 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735424 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} err="failed to get container status \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist" Apr 16 19:56:26.735496 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735442 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380" Apr 16 19:56:26.735666 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735644 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} err="failed to get container status 
\"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": rpc error: code = NotFound desc = could not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist" Apr 16 19:56:26.735722 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735667 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c" Apr 16 19:56:26.735973 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735947 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} err="failed to get container status \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist" Apr 16 19:56:26.735973 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.735972 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43" Apr 16 19:56:26.736381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.736354 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} err="failed to get container status \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": rpc error: code = NotFound desc = could not find container \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist" Apr 16 19:56:26.736381 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:56:26.736380 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"
Apr 16 19:56:26.736657 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.736624 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} err="failed to get container status \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist"
Apr 16 19:56:26.736730 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.736657 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"
Apr 16 19:56:26.736920 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.736892 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} err="failed to get container status \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist"
Apr 16 19:56:26.736920 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.736915 2561 scope.go:117] "RemoveContainer" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"
Apr 16 19:56:26.737140 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737120 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} err="failed to get container status \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not exist"
Apr 16 19:56:26.737140 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737140 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"
Apr 16 19:56:26.737357 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737334 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} err="failed to get container status \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist"
Apr 16 19:56:26.737418 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737358 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"
Apr 16 19:56:26.737418 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737378 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:26.737596 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737574 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} err="failed to get container status \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": rpc error: code = NotFound desc = could not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist"
Apr 16 19:56:26.737596 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737596 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"
Apr 16 19:56:26.737705 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737654 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="thanos-sidecar"
Apr 16 19:56:26.737705 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737669 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="thanos-sidecar"
Apr 16 19:56:26.737705 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737684 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="init-config-reloader"
Apr 16 19:56:26.737705 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737693 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="init-config-reloader"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737707 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-thanos"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737715 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-thanos"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737727 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="config-reloader"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737735 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="config-reloader"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737748 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="prometheus"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737756 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="prometheus"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737776 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-web"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737784 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-web"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737792 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4775e95e-4763-42b5-ad41-20316a962a92" containerName="console"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737797 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="4775e95e-4763-42b5-ad41-20316a962a92" containerName="console"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737804 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737809 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737812 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} err="failed to get container status \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737847 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"
Apr 16 19:56:26.737867 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737873 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-web"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737882 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="config-reloader"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737888 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="4775e95e-4763-42b5-ad41-20316a962a92" containerName="console"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737896 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737902 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="kube-rbac-proxy-thanos"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737908 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="prometheus"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.737915 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" containerName="thanos-sidecar"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738071 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} err="failed to get container status \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": rpc error: code = NotFound desc = could not find container \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738093 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738293 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} err="failed to get container status \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist"
Apr 16 19:56:26.738381 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738307 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"
Apr 16 19:56:26.738693 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738512 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} err="failed to get container status \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist"
Apr 16 19:56:26.738693 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738525 2561 scope.go:117] "RemoveContainer" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"
Apr 16 19:56:26.738771 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738712 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} err="failed to get container status \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not exist"
Apr 16 19:56:26.738771 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.738726 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"
Apr 16 19:56:26.739060 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739043 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} err="failed to get container status \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist"
Apr 16 19:56:26.739134 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739060 2561 scope.go:117] "RemoveContainer" containerID="75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"
Apr 16 19:56:26.739285 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739267 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380"} err="failed to get container status \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": rpc error: code = NotFound desc = could not find container \"75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380\": container with ID starting with 75355f0e3da55a8cff3f92c5af775c8dfe3f47d3bbd4959c8384603971134380 not found: ID does not exist"
Apr 16 19:56:26.739348 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739286 2561 scope.go:117] "RemoveContainer" containerID="2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"
Apr 16 19:56:26.739489 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739472 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c"} err="failed to get container status \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": rpc error: code = NotFound desc = could not find container \"2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c\": container with ID starting with 2e198ab3ade11dc9849767f7b5b0faacbdd3afaf239c368b8d491afd0f9c6c7c not found: ID does not exist"
Apr 16 19:56:26.739551 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739492 2561 scope.go:117] "RemoveContainer" containerID="f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"
Apr 16 19:56:26.739659 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739642 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43"} err="failed to get container status \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": rpc error: code = NotFound desc = could not find container \"f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43\": container with ID starting with f64ddb2182e2cd7b38debe2e7d9885e44875dca4b65548de3101cd921c55db43 not found: ID does not exist"
Apr 16 19:56:26.739712 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739660 2561 scope.go:117] "RemoveContainer" containerID="5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"
Apr 16 19:56:26.739861 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739845 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6"} err="failed to get container status \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": rpc error: code = NotFound desc = could not find container \"5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6\": container with ID starting with 5586c873b3de0a62ec557085fd611da5262911554004f827edbabdfbd71bdbd6 not found: ID does not exist"
Apr 16 19:56:26.739915 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.739861 2561 scope.go:117] "RemoveContainer" containerID="652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"
Apr 16 19:56:26.740030 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.740013 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d"} err="failed to get container status \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": rpc error: code = NotFound desc = could not find container \"652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d\": container with ID starting with 652c39820cfa0cf09a8205cfe7bb359fe3e47f7979babb79a4836c41e9b71b0d not found: ID does not exist"
Apr 16 19:56:26.740094 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.740031 2561 scope.go:117] "RemoveContainer" containerID="c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"
Apr 16 19:56:26.740244 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.740211 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003"} err="failed to get container status \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": rpc error: code = NotFound desc = could not find container \"c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003\": container with ID starting with c73cd04687f73a09dfdab49b9e8ef65b3884d52b0ccf0170c7b79bd7d52eb003 not found: ID does not exist"
Apr 16 19:56:26.740296 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.740246 2561 scope.go:117] "RemoveContainer" containerID="d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"
Apr 16 19:56:26.740446 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.740429 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455"} err="failed to get container status \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": rpc error: code = NotFound desc = could not find container \"d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455\": container with ID starting with d2353f3cb4e117e411bd930fad03e5dce712b1deb17faec03d50958e2734c455 not found: ID does not exist"
Apr 16 19:56:26.743244 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.743230 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.745062 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745047 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 19:56:26.745165 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745141 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 19:56:26.745228 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745168 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 19:56:26.745228 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745176 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 19:56:26.745228 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745195 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 19:56:26.745389 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745268 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 19:56:26.745389 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745316 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 19:56:26.745492 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745475 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:56:26.745598 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745583 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 19:56:26.745665 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745649 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 19:56:26.745783 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745767 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jkjtr\""
Apr 16 19:56:26.745912 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745899 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 19:56:26.746007 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.745989 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3nrmrj703e4ke\""
Apr 16 19:56:26.750082 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.749952 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 19:56:26.751152 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.751135 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 19:56:26.753439 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.753417 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:56:26.884885 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.884823 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.884885 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.884886 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.884927 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-config\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.884953 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.884977 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885001 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trh4z\" (UniqueName: \"kubernetes.io/projected/1c9a3071-156d-4e4d-ac1a-8e92162c293e-kube-api-access-trh4z\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885015 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c9a3071-156d-4e4d-ac1a-8e92162c293e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885031 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885052 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885079 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885071 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885091 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885116 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885167 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c9a3071-156d-4e4d-ac1a-8e92162c293e-config-out\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885207 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885235 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885255 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-web-config\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885272 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.885358 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.885319 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986343 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986248 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c9a3071-156d-4e4d-ac1a-8e92162c293e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986343 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986299 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986343 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986325 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986357 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986387 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986415 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986440 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c9a3071-156d-4e4d-ac1a-8e92162c293e-config-out\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986473 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986504 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986647 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-web-config\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986690 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986733 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986770 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986811 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.986879 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986868 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-config\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.987466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986902 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.987466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986928 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.987466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.986964 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trh4z\" (UniqueName: \"kubernetes.io/projected/1c9a3071-156d-4e4d-ac1a-8e92162c293e-kube-api-access-trh4z\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.987466 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.987273 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.987674 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.987613 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.988248 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.988220 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.989252 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.989228 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.989693 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.989671 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.989789 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.989724 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:56:26.989886 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.989846 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-kube-rbac-proxy\") pod
\"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.990013 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.989990 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c9a3071-156d-4e4d-ac1a-8e92162c293e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.990153 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.990132 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.990737 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.990662 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.991824 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.991791 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.993978 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.992029 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-config\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.993978 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.992210 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-web-config\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.993978 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.992339 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.993978 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.992450 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c9a3071-156d-4e4d-ac1a-8e92162c293e-config-out\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.993978 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.992507 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1c9a3071-156d-4e4d-ac1a-8e92162c293e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.994259 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.994088 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/1c9a3071-156d-4e4d-ac1a-8e92162c293e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:26.998605 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:26.998586 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trh4z\" (UniqueName: \"kubernetes.io/projected/1c9a3071-156d-4e4d-ac1a-8e92162c293e-kube-api-access-trh4z\") pod \"prometheus-k8s-0\" (UID: \"1c9a3071-156d-4e4d-ac1a-8e92162c293e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:27.052975 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:27.052947 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:56:27.154237 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:27.154208 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3aa8ce4-8886-41a4-9744-0fb16fa33fc2" path="/var/lib/kubelet/pods/b3aa8ce4-8886-41a4-9744-0fb16fa33fc2/volumes" Apr 16 19:56:27.173509 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:27.173399 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:56:27.176296 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:56:27.176272 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c9a3071_156d_4e4d_ac1a_8e92162c293e.slice/crio-78ff1265c56d379de1e3e1bcdabaa9d7dfafc46aa1729c4d221261f981c2e5db WatchSource:0}: Error finding container 78ff1265c56d379de1e3e1bcdabaa9d7dfafc46aa1729c4d221261f981c2e5db: Status 404 returned error can't find the container with id 78ff1265c56d379de1e3e1bcdabaa9d7dfafc46aa1729c4d221261f981c2e5db Apr 16 19:56:27.689472 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:27.689443 2561 generic.go:358] "Generic (PLEG): container finished" 
podID="1c9a3071-156d-4e4d-ac1a-8e92162c293e" containerID="de4b4a94dce50b81764ae0c2f8f48ab16d39a3eacef943b21f0dcd4820c61597" exitCode=0 Apr 16 19:56:27.689638 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:27.689535 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerDied","Data":"de4b4a94dce50b81764ae0c2f8f48ab16d39a3eacef943b21f0dcd4820c61597"} Apr 16 19:56:27.689638 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:27.689573 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"78ff1265c56d379de1e3e1bcdabaa9d7dfafc46aa1729c4d221261f981c2e5db"} Apr 16 19:56:28.698254 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.698223 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"d9be038fa80a4221c515e4c89c49d576d8333035172a32eef9be281f165ebb4d"} Apr 16 19:56:28.698254 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.698257 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"7aa967babf8062b32b7f6e8880e3e474b06c79f10045503f6abfaf8b9193a60a"} Apr 16 19:56:28.698636 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.698266 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"b370d284a75774ac5751c206e599f027a383093af3dce0dc8b4af233c7f2e329"} Apr 16 19:56:28.698636 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.698276 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"30b29a09c1f7fbda456cb61628395a0db468b64b7cb0870e36728fbd86a3b96b"} Apr 16 19:56:28.698636 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.698286 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"bfccaf3fbbee8f4b9f35e85cc283eacbbdc73fe7e6a5c0718fd2c5965f4b2dcd"} Apr 16 19:56:28.698636 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.698294 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1c9a3071-156d-4e4d-ac1a-8e92162c293e","Type":"ContainerStarted","Data":"df642e71a39e9c673a01a4a6a3bfa2eff5c0041d0137c6dfb3438dac4ec4a776"} Apr 16 19:56:28.725062 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:28.724891 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.724873553 podStartE2EDuration="2.724873553s" podCreationTimestamp="2026-04-16 19:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:28.723028637 +0000 UTC m=+150.206177570" watchObservedRunningTime="2026-04-16 19:56:28.724873553 +0000 UTC m=+150.208022473" Apr 16 19:56:32.053638 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:56:32.053603 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:27.053459 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:27.053425 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:27.068558 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:27.068532 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:27.873143 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:27.873114 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:57:28.866263 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.866231 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb"] Apr 16 19:57:28.869934 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.869904 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:28.873796 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.873776 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 19:57:28.874384 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.874364 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:57:28.874508 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.874366 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 19:57:28.874508 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.874368 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 19:57:28.874508 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.874368 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 19:57:28.874711 ip-10-0-130-164 
kubenswrapper[2561]: I0416 19:57:28.874696 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:57:28.874777 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.874730 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:57:28.885939 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.885918 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb"] Apr 16 19:57:28.973869 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.973820 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lgs2\" (UniqueName: \"kubernetes.io/projected/71376751-efdb-4579-8aa2-f95fe587fc92-kube-api-access-2lgs2\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:28.974039 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.973896 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:28.974109 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.974056 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-ca\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:28.974198 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.974179 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-hub\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:28.974347 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.974327 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/71376751-efdb-4579-8aa2-f95fe587fc92-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:28.974437 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:28.974415 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.075255 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.075212 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lgs2\" (UniqueName: \"kubernetes.io/projected/71376751-efdb-4579-8aa2-f95fe587fc92-kube-api-access-2lgs2\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 
16 19:57:29.075255 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.075263 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.075437 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.075309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-ca\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.075437 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.075332 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-hub\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.075437 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.075352 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/71376751-efdb-4579-8aa2-f95fe587fc92-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.075437 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.075385 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.076121 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.076093 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/71376751-efdb-4579-8aa2-f95fe587fc92-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.077872 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.077852 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-ca\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.078110 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.078090 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.078157 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.078103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-hub\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.078238 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.078219 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/71376751-efdb-4579-8aa2-f95fe587fc92-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.083981 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.083958 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lgs2\" (UniqueName: \"kubernetes.io/projected/71376751-efdb-4579-8aa2-f95fe587fc92-kube-api-access-2lgs2\") pod \"cluster-proxy-proxy-agent-77dd46d4fd-8xrtb\" (UID: \"71376751-efdb-4579-8aa2-f95fe587fc92\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.199061 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.199034 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" Apr 16 19:57:29.315071 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.315044 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb"] Apr 16 19:57:29.318079 ip-10-0-130-164 kubenswrapper[2561]: W0416 19:57:29.318043 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71376751_efdb_4579_8aa2_f95fe587fc92.slice/crio-5a87b224132fd2b88ad458afb6b83effe0bcc843faefbdb4a7a3913006366fc8 WatchSource:0}: Error finding container 5a87b224132fd2b88ad458afb6b83effe0bcc843faefbdb4a7a3913006366fc8: Status 404 returned error can't find the container with id 5a87b224132fd2b88ad458afb6b83effe0bcc843faefbdb4a7a3913006366fc8 Apr 16 19:57:29.864085 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:29.864040 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" event={"ID":"71376751-efdb-4579-8aa2-f95fe587fc92","Type":"ContainerStarted","Data":"5a87b224132fd2b88ad458afb6b83effe0bcc843faefbdb4a7a3913006366fc8"} Apr 16 19:57:32.875077 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:32.875038 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" event={"ID":"71376751-efdb-4579-8aa2-f95fe587fc92","Type":"ContainerStarted","Data":"a58cdb31c6e6eb934ea828cedc20dfefdf7445269443dd34c5f123f601731b86"} Apr 16 19:57:34.882517 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:34.882429 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" 
event={"ID":"71376751-efdb-4579-8aa2-f95fe587fc92","Type":"ContainerStarted","Data":"6ce84cb9a7fd88a189949a4e8539b57d3c0592c929db302e37b9be7d9e95c7c2"} Apr 16 19:57:34.882517 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:34.882469 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" event={"ID":"71376751-efdb-4579-8aa2-f95fe587fc92","Type":"ContainerStarted","Data":"ee057871b015688b339afd89272eda9356a0626ab216100eab9ca14e42a0a1d0"} Apr 16 19:57:34.900414 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:57:34.900367 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-77dd46d4fd-8xrtb" podStartSLOduration=2.311373097 podStartE2EDuration="6.900351738s" podCreationTimestamp="2026-04-16 19:57:28 +0000 UTC" firstStartedPulling="2026-04-16 19:57:29.320264096 +0000 UTC m=+210.803412994" lastFinishedPulling="2026-04-16 19:57:33.909242738 +0000 UTC m=+215.392391635" observedRunningTime="2026-04-16 19:57:34.899207055 +0000 UTC m=+216.382355984" watchObservedRunningTime="2026-04-16 19:57:34.900351738 +0000 UTC m=+216.383500658" Apr 16 19:58:59.042075 ip-10-0-130-164 kubenswrapper[2561]: I0416 19:58:59.042046 2561 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:40:59.656737 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.656696 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zn2n8/must-gather-4bv4f"] Apr 16 20:40:59.659572 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.659554 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.661730 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.661707 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zn2n8\"/\"default-dockercfg-n9vjb\"" Apr 16 20:40:59.661944 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.661930 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zn2n8\"/\"openshift-service-ca.crt\"" Apr 16 20:40:59.662518 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.662497 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zn2n8\"/\"kube-root-ca.crt\"" Apr 16 20:40:59.667512 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.667493 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/must-gather-4bv4f"] Apr 16 20:40:59.818960 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.818912 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8eddd857-ff74-4fd5-b129-1c4ac7b024b5-must-gather-output\") pod \"must-gather-4bv4f\" (UID: \"8eddd857-ff74-4fd5-b129-1c4ac7b024b5\") " pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.819144 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.818980 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-455nr\" (UniqueName: \"kubernetes.io/projected/8eddd857-ff74-4fd5-b129-1c4ac7b024b5-kube-api-access-455nr\") pod \"must-gather-4bv4f\" (UID: \"8eddd857-ff74-4fd5-b129-1c4ac7b024b5\") " pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.920226 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.920191 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/8eddd857-ff74-4fd5-b129-1c4ac7b024b5-must-gather-output\") pod \"must-gather-4bv4f\" (UID: \"8eddd857-ff74-4fd5-b129-1c4ac7b024b5\") " pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.920338 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.920261 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-455nr\" (UniqueName: \"kubernetes.io/projected/8eddd857-ff74-4fd5-b129-1c4ac7b024b5-kube-api-access-455nr\") pod \"must-gather-4bv4f\" (UID: \"8eddd857-ff74-4fd5-b129-1c4ac7b024b5\") " pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.920541 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.920525 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8eddd857-ff74-4fd5-b129-1c4ac7b024b5-must-gather-output\") pod \"must-gather-4bv4f\" (UID: \"8eddd857-ff74-4fd5-b129-1c4ac7b024b5\") " pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.928608 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.928589 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-455nr\" (UniqueName: \"kubernetes.io/projected/8eddd857-ff74-4fd5-b129-1c4ac7b024b5-kube-api-access-455nr\") pod \"must-gather-4bv4f\" (UID: \"8eddd857-ff74-4fd5-b129-1c4ac7b024b5\") " pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:40:59.969422 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:40:59.969393 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zn2n8/must-gather-4bv4f" Apr 16 20:41:00.084913 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:00.084879 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/must-gather-4bv4f"] Apr 16 20:41:00.087980 ip-10-0-130-164 kubenswrapper[2561]: W0416 20:41:00.087954 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eddd857_ff74_4fd5_b129_1c4ac7b024b5.slice/crio-a9c2e48a0b36768c0d98eb480d480dbc55707efdde8c9264eb6ba07183402cff WatchSource:0}: Error finding container a9c2e48a0b36768c0d98eb480d480dbc55707efdde8c9264eb6ba07183402cff: Status 404 returned error can't find the container with id a9c2e48a0b36768c0d98eb480d480dbc55707efdde8c9264eb6ba07183402cff Apr 16 20:41:00.089593 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:00.089577 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:41:00.781571 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:00.781532 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/must-gather-4bv4f" event={"ID":"8eddd857-ff74-4fd5-b129-1c4ac7b024b5","Type":"ContainerStarted","Data":"a9c2e48a0b36768c0d98eb480d480dbc55707efdde8c9264eb6ba07183402cff"} Apr 16 20:41:01.785975 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:01.785933 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/must-gather-4bv4f" event={"ID":"8eddd857-ff74-4fd5-b129-1c4ac7b024b5","Type":"ContainerStarted","Data":"8bd3f1febd26ee3c9b1951520f943bb3051cb5ed17fb345556328543fa095e71"} Apr 16 20:41:01.785975 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:01.785980 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/must-gather-4bv4f" 
event={"ID":"8eddd857-ff74-4fd5-b129-1c4ac7b024b5","Type":"ContainerStarted","Data":"d38bae884b55f2bf1bbcaa64be67df4d0bd9bf890f2719a1199b5816052b11f9"} Apr 16 20:41:01.801915 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:01.801823 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zn2n8/must-gather-4bv4f" podStartSLOduration=1.855144675 podStartE2EDuration="2.801805222s" podCreationTimestamp="2026-04-16 20:40:59 +0000 UTC" firstStartedPulling="2026-04-16 20:41:00.089706266 +0000 UTC m=+2821.572855163" lastFinishedPulling="2026-04-16 20:41:01.036366813 +0000 UTC m=+2822.519515710" observedRunningTime="2026-04-16 20:41:01.80108408 +0000 UTC m=+2823.284233002" watchObservedRunningTime="2026-04-16 20:41:01.801805222 +0000 UTC m=+2823.284954142" Apr 16 20:41:02.424730 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:02.424702 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dghfv_65540f57-d96d-4dcd-b7a1-010ff5e40cae/global-pull-secret-syncer/0.log" Apr 16 20:41:02.532414 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:02.532383 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h2cnc_8890691e-8929-4051-8642-9a4556f51961/konnectivity-agent/0.log" Apr 16 20:41:02.649141 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:02.649110 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-164.ec2.internal_3a0a5c69e63ac560ca3e107bf62451fd/haproxy/0.log" Apr 16 20:41:06.309012 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.308906 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-spv5g_22ae2e7e-371a-42e0-a0a9-0a3dc249db95/node-exporter/0.log" Apr 16 20:41:06.329863 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.329782 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-spv5g_22ae2e7e-371a-42e0-a0a9-0a3dc249db95/kube-rbac-proxy/0.log" Apr 16 20:41:06.352664 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.352636 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-spv5g_22ae2e7e-371a-42e0-a0a9-0a3dc249db95/init-textfile/0.log" Apr 16 20:41:06.380774 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.380747 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5549q_b3e122c2-a77f-466b-b593-7d6143d118cd/kube-rbac-proxy-main/0.log" Apr 16 20:41:06.401310 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.401286 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5549q_b3e122c2-a77f-466b-b593-7d6143d118cd/kube-rbac-proxy-self/0.log" Apr 16 20:41:06.426380 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.426352 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5549q_b3e122c2-a77f-466b-b593-7d6143d118cd/openshift-state-metrics/0.log" Apr 16 20:41:06.477527 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.477491 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/prometheus/0.log" Apr 16 20:41:06.495118 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.495092 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/config-reloader/0.log" Apr 16 20:41:06.517168 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.517139 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/thanos-sidecar/0.log" Apr 16 20:41:06.538567 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.538539 2561 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/kube-rbac-proxy-web/0.log" Apr 16 20:41:06.561489 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.561403 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/kube-rbac-proxy/0.log" Apr 16 20:41:06.583763 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.583737 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/kube-rbac-proxy-thanos/0.log" Apr 16 20:41:06.605978 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.605954 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1c9a3071-156d-4e4d-ac1a-8e92162c293e/init-config-reloader/0.log" Apr 16 20:41:06.782611 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.782578 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bb845d988-9wk9x_3f3e2227-a0ef-40fe-ba36-6816e80bce5c/thanos-query/0.log" Apr 16 20:41:06.805402 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.805369 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bb845d988-9wk9x_3f3e2227-a0ef-40fe-ba36-6816e80bce5c/kube-rbac-proxy-web/0.log" Apr 16 20:41:06.827663 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.827596 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bb845d988-9wk9x_3f3e2227-a0ef-40fe-ba36-6816e80bce5c/kube-rbac-proxy/0.log" Apr 16 20:41:06.850949 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.850920 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bb845d988-9wk9x_3f3e2227-a0ef-40fe-ba36-6816e80bce5c/prom-label-proxy/0.log" Apr 16 20:41:06.876540 ip-10-0-130-164 kubenswrapper[2561]: 
I0416 20:41:06.876517 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bb845d988-9wk9x_3f3e2227-a0ef-40fe-ba36-6816e80bce5c/kube-rbac-proxy-rules/0.log" Apr 16 20:41:06.897456 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:06.897427 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-bb845d988-9wk9x_3f3e2227-a0ef-40fe-ba36-6816e80bce5c/kube-rbac-proxy-metrics/0.log" Apr 16 20:41:09.645798 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.645762 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47"] Apr 16 20:41:09.649995 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.649970 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.656368 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.656334 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47"] Apr 16 20:41:09.712767 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.712734 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vqn\" (UniqueName: \"kubernetes.io/projected/e2043cc7-e358-40b2-a110-db58dada5db0-kube-api-access-p6vqn\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.712959 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.712805 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-proc\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " 
pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.712959 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.712886 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-lib-modules\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.712959 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.712929 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-sys\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.713126 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.712961 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-podres\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814034 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.813998 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-proc\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814034 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814035 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-lib-modules\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814061 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-sys\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814083 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-podres\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814122 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-proc\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814127 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vqn\" (UniqueName: \"kubernetes.io/projected/e2043cc7-e358-40b2-a110-db58dada5db0-kube-api-access-p6vqn\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 
20:41:09.814146 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-sys\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814182 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-lib-modules\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.814268 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.814234 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2043cc7-e358-40b2-a110-db58dada5db0-podres\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.821538 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.821513 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vqn\" (UniqueName: \"kubernetes.io/projected/e2043cc7-e358-40b2-a110-db58dada5db0-kube-api-access-p6vqn\") pod \"perf-node-gather-daemonset-mzv47\" (UID: \"e2043cc7-e358-40b2-a110-db58dada5db0\") " pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:09.963549 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:09.963508 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:10.080250 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.080219 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47"] Apr 16 20:41:10.083620 ip-10-0-130-164 kubenswrapper[2561]: W0416 20:41:10.083591 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode2043cc7_e358_40b2_a110_db58dada5db0.slice/crio-74d04e5a73a7d96d59c70724d25490e778cbb7dd7edc8735f6d1336678a00f91 WatchSource:0}: Error finding container 74d04e5a73a7d96d59c70724d25490e778cbb7dd7edc8735f6d1336678a00f91: Status 404 returned error can't find the container with id 74d04e5a73a7d96d59c70724d25490e778cbb7dd7edc8735f6d1336678a00f91 Apr 16 20:41:10.094261 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.094241 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pzd2h_854ee71f-1cd4-4645-9909-8f0da088e901/dns/0.log" Apr 16 20:41:10.116950 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.116922 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pzd2h_854ee71f-1cd4-4645-9909-8f0da088e901/kube-rbac-proxy/0.log" Apr 16 20:41:10.140231 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.140214 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bz8k8_cba09151-5332-4eae-8b38-4f1cfe937ce5/dns-node-resolver/0.log" Apr 16 20:41:10.570282 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.570255 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-68b985c7d7-k2vd9_f8f97a0d-d529-41d9-b1dd-4aa61965968d/registry/0.log" Apr 16 20:41:10.635112 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.635087 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-xbhwz_012be36b-5039-4ab8-82de-702be926779a/node-ca/0.log" Apr 16 20:41:10.817177 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.817129 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" event={"ID":"e2043cc7-e358-40b2-a110-db58dada5db0","Type":"ContainerStarted","Data":"32bd7a3af9bddb2fbdf31f5f2598285f39823b0f02131f5c5674b94c28a5214b"} Apr 16 20:41:10.817177 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.817168 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" event={"ID":"e2043cc7-e358-40b2-a110-db58dada5db0","Type":"ContainerStarted","Data":"74d04e5a73a7d96d59c70724d25490e778cbb7dd7edc8735f6d1336678a00f91"} Apr 16 20:41:10.817846 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.817272 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:10.832551 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:10.832442 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" podStartSLOduration=1.832427461 podStartE2EDuration="1.832427461s" podCreationTimestamp="2026-04-16 20:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:41:10.830802367 +0000 UTC m=+2832.313951287" watchObservedRunningTime="2026-04-16 20:41:10.832427461 +0000 UTC m=+2832.315576379" Apr 16 20:41:11.634464 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:11.634433 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-htqsk_3d9dabfc-3f32-488d-81d1-0da63416b50c/serve-healthcheck-canary/0.log" Apr 16 20:41:12.028537 ip-10-0-130-164 kubenswrapper[2561]: I0416 
20:41:12.028510 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89xkw_36f87ce1-84b3-4474-9bf8-1034549406d7/kube-rbac-proxy/0.log" Apr 16 20:41:12.050909 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:12.050885 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89xkw_36f87ce1-84b3-4474-9bf8-1034549406d7/exporter/0.log" Apr 16 20:41:12.078579 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:12.078553 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-89xkw_36f87ce1-84b3-4474-9bf8-1034549406d7/extractor/0.log" Apr 16 20:41:16.830632 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:16.830607 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zn2n8/perf-node-gather-daemonset-mzv47" Apr 16 20:41:19.890532 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:19.890503 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6j2q7_87e620d6-c02b-44e6-897b-45e0488dc88a/kube-multus/0.log" Apr 16 20:41:20.062183 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.062157 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/kube-multus-additional-cni-plugins/0.log" Apr 16 20:41:20.083198 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.083175 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/egress-router-binary-copy/0.log" Apr 16 20:41:20.105434 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.105409 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/cni-plugins/0.log" Apr 16 20:41:20.127774 ip-10-0-130-164 
kubenswrapper[2561]: I0416 20:41:20.127748 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/bond-cni-plugin/0.log" Apr 16 20:41:20.150621 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.150553 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/routeoverride-cni/0.log" Apr 16 20:41:20.173189 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.173161 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/whereabouts-cni-bincopy/0.log" Apr 16 20:41:20.194694 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.194668 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c96xs_c7bfe2d4-039f-4e5e-bd3c-5295e58ee27c/whereabouts-cni/0.log" Apr 16 20:41:20.384940 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.384913 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hkqtk_a0ce8326-ca5a-49b9-90c0-94db13c2c74e/network-metrics-daemon/0.log" Apr 16 20:41:20.407115 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:20.407043 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hkqtk_a0ce8326-ca5a-49b9-90c0-94db13c2c74e/kube-rbac-proxy/0.log" Apr 16 20:41:21.880971 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:21.880945 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/ovn-controller/0.log" Apr 16 20:41:21.915805 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:21.915780 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/ovn-acl-logging/0.log" Apr 16 20:41:21.936109 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:21.936073 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/kube-rbac-proxy-node/0.log" Apr 16 20:41:21.955713 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:21.955679 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 20:41:21.975393 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:21.975373 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/northd/0.log" Apr 16 20:41:21.995819 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:21.995795 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/nbdb/0.log" Apr 16 20:41:22.016295 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:22.016257 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/sbdb/0.log" Apr 16 20:41:22.149906 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:22.149838 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-t5dmh_c5e7c7d5-574f-4907-8f05-2b58d5c7118f/ovnkube-controller/0.log" Apr 16 20:41:23.091788 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:23.091756 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jkgr6_3e96a428-4bd2-4a4f-b624-974f68f14131/network-check-target-container/0.log" Apr 16 20:41:23.997644 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:23.997615 2561 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-ck9lp_8b791b2a-d9bd-4b03-8a90-5f7d696bf7d5/iptables-alerter/0.log" Apr 16 20:41:24.638457 ip-10-0-130-164 kubenswrapper[2561]: I0416 20:41:24.638432 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-t9q6h_5f9ec80f-6c8e-4cdf-989c-4c357a66efe8/tuned/0.log"