Apr 16 17:38:08.583933 ip-10-0-133-244 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 17:38:08.583943 ip-10-0-133-244 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 17:38:08.583982 ip-10-0-133-244 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 17:38:08.584344 ip-10-0-133-244 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 17:38:18.658632 ip-10-0-133-244 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 17:38:18.658652 ip-10-0-133-244 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 0a8c67af560e43a9b721f6a92cf98e98 --
Apr 16 17:40:39.630064 ip-10-0-133-244 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:40:40.070345 ip-10-0-133-244 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:40.070345 ip-10-0-133-244 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:40:40.070345 ip-10-0-133-244 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:40.070345 ip-10-0-133-244 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 17:40:40.070345 ip-10-0-133-244 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:40.071131 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.070991 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077833 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077852 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077856 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077860 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077863 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077866 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:40.077859 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077869 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077872 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077876 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077880 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077883 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077886 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077889 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077892 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077894 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077897 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077900 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077902 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077905 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077910 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077913 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077915 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077918 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077921 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077923 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077927 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:40.078124 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077929 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077932 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077935 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077937 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077940 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077942 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077945 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077947 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077950 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077952 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077955 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077957 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077960 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077963 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077968 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
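The deprecation warnings earlier in this boot say each of those flags should move into the file passed to --config. As a minimal sketch of that mapping (field names follow the upstream KubeletConfiguration v1beta1 API; the values and the /tmp path are illustrative assumptions, not this node's actual settings):

```shell
# Sketch: deprecated kubelet flags rewritten as KubeletConfiguration fields.
# Values below are illustrative assumptions, not this node's real settings.
cat > /tmp/kubelet-conf-example.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# replaces --minimum-container-ttl-duration (deprecated in favor of eviction thresholds)
evictionHard:
  memory.available: 100Mi
EOF
grep -c 'replaces --' /tmp/kubelet-conf-example.yaml
```

The kubelet loads such a file via `--config` (here `/etc/kubernetes/kubelet.conf`, per the FLAG dump later in this log); flags still set on the command line override the file.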
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077972 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077976 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077979 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077982 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:40.078636 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077985 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077987 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077990 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077992 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077995 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077997 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.077999 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078002 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078004 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078007 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078009 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078012 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078015 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078017 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078019 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078023 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078025 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078027 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078030 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078033 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:40.079136 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078035 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078038 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078040 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078042 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078046 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078048 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078050 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078054 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078056 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078061 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078065 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078068 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078070 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078075 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078077 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078082 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078085 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078088 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078091 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:40.079619 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078094 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078097 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078513 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078519 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078522 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078525 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078528 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078531 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078534 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078537 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078540 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078542 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078545 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078547 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078550 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078552 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078555 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078557 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078560 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078564 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:40.080112 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078568 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078572 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078574 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078577 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078579 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078582 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078585 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078588 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078590 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078593 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078595 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078598 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078600 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078603 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078605 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078608 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078610 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078613 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078615 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:40.080591 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078619 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078621 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078624 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078626 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078629 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078631 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078633 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078636 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078638 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078641 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078644 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078646 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078649 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078652 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078655 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078657 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078660 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078663 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078665 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:40.081119 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078668 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078674 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078677 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078680 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078682 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078684 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078687 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078689 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078720 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078734 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078737 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078739 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078742 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078744 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078747 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078750 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078752 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078756 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078760 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078763 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:40.081584 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078766 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078769 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078771 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078774 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078777 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078779 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078784 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078788 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078791 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.078794 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080441 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080450 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080457 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080462 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080468 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080471 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080476 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080480 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080483 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080486 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080490 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:40:40.082100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080493 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080496 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080499 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080502 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080505 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080508 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080511 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080513 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080519 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080522 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080525 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080528 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080532 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080536 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080539 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080542 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080546 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080549 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080553 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080556 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080559 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080562 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080566 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080569 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080572 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:40:40.082625 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080575 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080579 2579 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080582 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080586 2579 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080589 2579 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080592 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080595 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080598 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080602 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080605 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080608 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080611 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080614 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080617 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080620 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080623 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080626 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080629 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080632 2579 flags.go:64] FLAG: --feature-gates=""
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080636 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080639 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080642 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 17:40:40.083252 ip-10-0-133-244
kubenswrapper[2579]: I0416 17:40:40.080645 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080649 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080652 2579 flags.go:64] FLAG: --help="false" Apr 16 17:40:40.083252 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080656 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080659 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080662 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080665 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080668 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080672 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080675 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080678 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080681 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080684 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080687 2579 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080690 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080693 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080724 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080729 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080732 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080735 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080738 2579 flags.go:64] FLAG: --lock-file="" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080741 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080744 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080747 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080752 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080755 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080758 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:40:40.083879 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080761 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: 
I0416 17:40:40.080765 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080768 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080771 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080775 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080779 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080782 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080786 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080789 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080793 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080796 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080798 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080801 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080804 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080807 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 
17:40:40.080815 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080818 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080821 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080825 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080828 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080834 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080837 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080840 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080843 2579 flags.go:64] FLAG: --port="10250" Apr 16 17:40:40.084441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080846 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080849 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-033c54b1ff9a92e34" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080852 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080855 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080858 2579 flags.go:64] FLAG: --register-node="true" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080860 2579 
flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080863 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080870 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080874 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080877 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080880 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080884 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080887 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080890 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080892 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080895 2579 flags.go:64] FLAG: --runonce="false" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080898 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080901 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080904 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080907 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 
17:40:40.080910 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080913 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080916 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080919 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080922 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080925 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:40:40.085049 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080927 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080931 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080934 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080937 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080939 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080944 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080948 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080951 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080955 2579 flags.go:64] FLAG: 
--tls-min-version="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080958 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080961 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080964 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080966 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080969 2579 flags.go:64] FLAG: --v="2" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080974 2579 flags.go:64] FLAG: --version="false" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080978 2579 flags.go:64] FLAG: --vmodule="" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080983 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.080986 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081079 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081083 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081086 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081089 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081092 2579 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081095 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:40:40.085797 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081098 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081100 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081103 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081106 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081109 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081112 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081115 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081117 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081120 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081123 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081127 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081129 2579 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081132 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081135 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081137 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081140 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081143 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081145 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081148 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081150 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:40:40.086373 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081153 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081155 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081158 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081160 2579 feature_gate.go:328] unrecognized feature gate: 
AzureWorkloadIdentity Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081163 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081165 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081169 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081173 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081176 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081179 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081181 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081184 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081186 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081189 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081191 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081194 2579 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081196 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081199 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081204 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:40:40.086931 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081206 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081209 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081212 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081215 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081218 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081220 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081223 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081227 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081231 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081233 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081236 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081238 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081241 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081243 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081246 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081249 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081252 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081254 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081256 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081259 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:40.087390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081261 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081264 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081266 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081269 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081272 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081274 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081276 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081279 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081281 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081284 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081286 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081290 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081292 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081295 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081297 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081300 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081302 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081305 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081308 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081310 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:40.087901 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.081313 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.081858 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.088083 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.088100 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088148 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088152 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088155 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088159 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088161 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088164 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088167 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088170 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088174 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088179 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088182 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:40.088390 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088184 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088187 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088190 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088193 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088196 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088198 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088201 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088204 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088206 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088209 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088212 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088214 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088217 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088220 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088222 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088225 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088227 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088230 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088233 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088235 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:40.088804 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088240 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088242 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088245 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088248 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088250 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088253 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088256 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088258 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088261 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088263 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088266 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088268 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088271 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088273 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088276 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088278 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088281 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088283 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088286 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088288 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:40.089295 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088291 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088294 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088296 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088299 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088301 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088304 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088306 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088309 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088311 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088313 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088316 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088318 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088322 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088325 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088328 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088330 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088333 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088335 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088338 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088340 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:40.089836 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088344 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088348 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088351 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088354 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088356 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088359 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088361 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088363 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088366 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088368 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088371 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088374 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088376 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088378 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088381 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.088386 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:40.090354 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088492 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088498 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088501 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088503 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088506 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088509 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088512 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088514 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088517 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088521 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088524 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088527 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088529 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088532 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088534 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088537 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088539 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088542 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088544 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088546 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:40.090774 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088549 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088552 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088554 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088558 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088562 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088565 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088567 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088570 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088572 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088574 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088577 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088579 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088582 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088584 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088586 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088589 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088592 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088594 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088597 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:40.091268 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088599 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088602 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088605 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088607 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088611 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088613 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088615 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088618 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088621 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088623 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088625 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088629 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088632 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088635 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088638 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088641 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088643 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088646 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088648 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:40.091745 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088651 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088653 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088655 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088658 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088661 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088663 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088666 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088668 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088670 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088674 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088676 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088679 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088681 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088683 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088686 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088689 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088691 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088693 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088697 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088716 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:40.092204 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088719 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088722 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088725 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088727 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088730 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088733 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088735 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:40.088738 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.088743 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 17:40:40.092725 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.089427 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 17:40:40.093807 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.093753 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 17:40:40.094770 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.094756 2579 server.go:1019] "Starting client certificate rotation"
Apr 16 17:40:40.094875 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.094860 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:40:40.094911 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.094902 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 17:40:40.118627 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.118599 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:40.122785 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.122768 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:40.137047 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.137025 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:40:40.142121 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.142098 2579 log.go:25] "Validated CRI v1 image API"
Apr 16 17:40:40.144643 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.144620 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:40:40.149695 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.149673 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e6536947-0039-4c9f-b37a-b41fc97dd2a3:/dev/nvme0n1p3 e6a0bcb1-19a3-43ea-8973-dcc2ce6167d0:/dev/nvme0n1p4]
Apr 16 17:40:40.149914 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.149695 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:40:40.152562 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.152540 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:40.155581 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.155461 2579 manager.go:217] Machine: {Timestamp:2026-04-16 17:40:40.153634096 +0000 UTC m=+0.405048911 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098948 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d05abc747cc3b0704afc0d99e2c81 SystemUUID:ec2d05ab-c747-cc3b-0704-afc0d99e2c81 BootID:0a8c67af-560e-43a9-b721-f6a92cf98e98 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b4:14:d1:6c:61 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b4:14:d1:6c:61 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:b4:a2:47:b6:22 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:40:40.155581 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.155570 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:40:40.155767 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.155686 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 17:40:40.159142 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.159113 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 17:40:40.159311 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.159145 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-244.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 17:40:40.159394 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.159326 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 17:40:40.159394 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.159337 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 17:40:40.159394 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.159356 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 17:40:40.160126 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.160112 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 17:40:40.161358 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.161346 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 17:40:40.161485 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.161474 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 17:40:40.163756 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.163745 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 16 17:40:40.163822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.163763 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 17:40:40.163822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.163781 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 17:40:40.163822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.163794 2579 kubelet.go:397] "Adding apiserver pod source" Apr 16 17:40:40.163822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.163817 2579 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 17:40:40.164859 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.164846 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 17:40:40.164930 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.164869 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 17:40:40.167604 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.167588 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 17:40:40.169009 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.168992 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 17:40:40.170048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170035 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170053 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170063 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170070 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170078 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170087 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170097 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170105 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170117 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 17:40:40.170125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170125 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 17:40:40.170395 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170138 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 17:40:40.170395 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170151 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 17:40:40.170950 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170938 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 17:40:40.170995 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.170954 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 17:40:40.171521 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.171502 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kzgjc" Apr 16 17:40:40.173042 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.172577 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 17:40:40.173377 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.173342 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-244.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 17:40:40.175217 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.175199 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-244.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 17:40:40.175463 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.175449 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 17:40:40.175513 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.175490 2579 server.go:1295] "Started kubelet" Apr 16 17:40:40.175620 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.175572 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 17:40:40.175686 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.175637 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 17:40:40.175756 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.175721 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 17:40:40.176448 ip-10-0-133-244 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 17:40:40.176857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.176830 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 17:40:40.178048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.178035 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 17:40:40.180233 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.180209 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kzgjc" Apr 16 17:40:40.181140 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.181125 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 17:40:40.181586 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.181573 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 17:40:40.182623 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182146 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 17:40:40.182623 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182167 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 17:40:40.182623 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182266 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 17:40:40.182623 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182338 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 17:40:40.182623 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182346 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 17:40:40.182954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182671 2579 factory.go:153] Registering CRI-O factory Apr 16 17:40:40.182954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.182686 2579 factory.go:223] Registration of the crio container factory successfully Apr 16 17:40:40.183169 ip-10-0-133-244 kubenswrapper[2579]: E0416 
17:40:40.181986 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-244.ec2.internal.18a6e722e96e503c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-244.ec2.internal,UID:ip-10-0-133-244.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-244.ec2.internal,},FirstTimestamp:2026-04-16 17:40:40.175464508 +0000 UTC m=+0.426879324,LastTimestamp:2026-04-16 17:40:40.175464508 +0000 UTC m=+0.426879324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-244.ec2.internal,}" Apr 16 17:40:40.185059 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.185037 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 17:40:40.185059 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.185047 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found" Apr 16 17:40:40.185209 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.185058 2579 factory.go:55] Registering systemd factory Apr 16 17:40:40.185209 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.185083 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 17:40:40.185209 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.185120 2579 factory.go:103] Registering Raw factory Apr 16 17:40:40.185209 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.185134 2579 manager.go:1196] 
Started watching for new ooms in manager Apr 16 17:40:40.185575 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.185559 2579 manager.go:319] Starting recovery of all containers Apr 16 17:40:40.186011 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.185966 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 17:40:40.192768 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.190986 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:40.195157 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.195031 2579 manager.go:324] Recovery completed Apr 16 17:40:40.195243 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.195158 2579 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-244.ec2.internal\" not found" node="ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.197337 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.197238 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": readdirent /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 17:40:40.200903 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.200890 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:40.203319 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.203303 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:40.203394 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.203334 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:40.203394 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:40:40.203344 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:40.203798 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.203786 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 17:40:40.203874 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.203797 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 17:40:40.203874 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.203817 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 17:40:40.205778 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.205765 2579 policy_none.go:49] "None policy: Start" Apr 16 17:40:40.205838 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.205783 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 17:40:40.205838 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.205796 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 17:40:40.239344 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239327 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.239368 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239381 2579 server.go:85] "Starting device plugin registration server" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239621 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239632 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239759 2579 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239865 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.239876 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.240403 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 17:40:40.255183 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.240438 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-244.ec2.internal\" not found" Apr 16 17:40:40.281100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.281074 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 17:40:40.282389 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.282366 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 17:40:40.282488 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.282393 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 17:40:40.282488 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.282409 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 17:40:40.282488 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.282417 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 17:40:40.282488 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.282452 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 17:40:40.289780 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.289760 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:40.340047 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.339980 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:40.340923 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.340907 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:40.341003 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.340938 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:40.341003 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.340954 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:40.341003 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.340981 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.354367 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.354344 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.354367 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.354367 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-244.ec2.internal\": node \"ip-10-0-133-244.ec2.internal\" not found" Apr 16 
17:40:40.383054 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.383026 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal"] Apr 16 17:40:40.383170 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.383090 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:40.384046 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.384032 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:40.384119 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.384062 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:40.384119 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.384077 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:40.385215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385203 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:40.385371 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385356 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.385427 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385395 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:40.385946 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385923 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:40.386048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385950 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:40.386048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385961 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:40.386048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.385923 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:40.386048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.386027 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:40.386048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.386042 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:40.387602 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.387586 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.387662 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.387620 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 17:40:40.388296 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.388283 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientMemory" Apr 16 17:40:40.388362 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.388305 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 17:40:40.388362 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.388315 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeHasSufficientPID" Apr 16 17:40:40.408264 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.408242 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found" Apr 16 17:40:40.413338 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.413324 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-244.ec2.internal\" not found" node="ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.417630 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.417615 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-244.ec2.internal\" not found" node="ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.483689 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.483648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28c03fc8170b3947ca7170efad520626-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal\" (UID: \"28c03fc8170b3947ca7170efad520626\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.508518 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.508472 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found" Apr 16 17:40:40.584305 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.584267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28c03fc8170b3947ca7170efad520626-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal\" (UID: \"28c03fc8170b3947ca7170efad520626\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.584305 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.584307 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28c03fc8170b3947ca7170efad520626-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal\" (UID: \"28c03fc8170b3947ca7170efad520626\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.584498 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.584333 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69c8b1379aebe31aba3bde3fe6d1e4ea-config\") pod \"kube-apiserver-proxy-ip-10-0-133-244.ec2.internal\" (UID: \"69c8b1379aebe31aba3bde3fe6d1e4ea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.584498 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.584370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/28c03fc8170b3947ca7170efad520626-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal\" (UID: \"28c03fc8170b3947ca7170efad520626\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.609404 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.609337 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found" Apr 16 17:40:40.685006 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.684982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28c03fc8170b3947ca7170efad520626-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal\" (UID: \"28c03fc8170b3947ca7170efad520626\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.685073 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.685011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69c8b1379aebe31aba3bde3fe6d1e4ea-config\") pod \"kube-apiserver-proxy-ip-10-0-133-244.ec2.internal\" (UID: \"69c8b1379aebe31aba3bde3fe6d1e4ea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.685073 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.685048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/69c8b1379aebe31aba3bde3fe6d1e4ea-config\") pod \"kube-apiserver-proxy-ip-10-0-133-244.ec2.internal\" (UID: \"69c8b1379aebe31aba3bde3fe6d1e4ea\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" Apr 16 17:40:40.685138 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.685081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/28c03fc8170b3947ca7170efad520626-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal\" (UID: \"28c03fc8170b3947ca7170efad520626\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal"
Apr 16 17:40:40.710119 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.710092 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found"
Apr 16 17:40:40.716289 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.716266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal"
Apr 16 17:40:40.719788 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:40.719772 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal"
Apr 16 17:40:40.810694 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.810644 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found"
Apr 16 17:40:40.911188 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:40.911117 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found"
Apr 16 17:40:41.011613 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:41.011580 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found"
Apr 16 17:40:41.094181 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.094151 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:40:41.094721 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.094299 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:40:41.094721 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.094304 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:40:41.112314 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:41.112279 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found"
Apr 16 17:40:41.181696 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.181635 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:40:41.182211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.182185 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:35:40 +0000 UTC" deadline="2027-09-14 09:54:09.879865681 +0000 UTC"
Apr 16 17:40:41.182211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.182211 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12376h13m28.697657707s"
Apr 16 17:40:41.192517 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:41.192485 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c8b1379aebe31aba3bde3fe6d1e4ea.slice/crio-569ee00b691787162f1d9cf1b339f039fa78e7072a7b23997d200ef03b18f65b WatchSource:0}: Error finding container 569ee00b691787162f1d9cf1b339f039fa78e7072a7b23997d200ef03b18f65b: Status 404 returned error can't find the container with id 569ee00b691787162f1d9cf1b339f039fa78e7072a7b23997d200ef03b18f65b
Apr 16 17:40:41.192917 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:41.192899 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c03fc8170b3947ca7170efad520626.slice/crio-293012cb94501f7c0648389742b169987082d82780c90574bf51b91edcdeb4b6 WatchSource:0}: Error finding container 293012cb94501f7c0648389742b169987082d82780c90574bf51b91edcdeb4b6: Status 404 returned error can't find the container with id 293012cb94501f7c0648389742b169987082d82780c90574bf51b91edcdeb4b6
Apr 16 17:40:41.197183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.197164 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:40:41.198042 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.198023 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:41.213051 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:41.213030 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-244.ec2.internal\" not found"
Apr 16 17:40:41.221142 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.221123 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9hv9"
Apr 16 17:40:41.228254 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.228241 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9hv9"
Apr 16 17:40:41.242386 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.242365 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:41.283045 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.283019 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal"
Apr 16 17:40:41.285901 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.285859 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" event={"ID":"28c03fc8170b3947ca7170efad520626","Type":"ContainerStarted","Data":"293012cb94501f7c0648389742b169987082d82780c90574bf51b91edcdeb4b6"}
Apr 16 17:40:41.286831 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.286811 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" event={"ID":"69c8b1379aebe31aba3bde3fe6d1e4ea","Type":"ContainerStarted","Data":"569ee00b691787162f1d9cf1b339f039fa78e7072a7b23997d200ef03b18f65b"}
Apr 16 17:40:41.300440 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.300423 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:40:41.302126 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.302113 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal"
Apr 16 17:40:41.311489 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.311477 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 17:40:41.369210 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:41.369187 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:42.164896 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.164861 2579 apiserver.go:52] "Watching apiserver"
Apr 16 17:40:42.175291 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.175262 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 17:40:42.177561 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.177523 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-vfq9g","openshift-multus/network-metrics-daemon-8zmrl","openshift-network-diagnostics/network-check-target-792zb","openshift-network-operator/iptables-alerter-s27g6","openshift-ovn-kubernetes/ovnkube-node-kd892","kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj","openshift-cluster-node-tuning-operator/tuned-4nrxt","openshift-dns/node-resolver-glb6b","openshift-image-registry/node-ca-wzns8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal","openshift-multus/multus-5z7gp","openshift-multus/multus-additional-cni-plugins-ckg4v"]
Apr 16 17:40:42.180795 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.180776 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.182352 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.182333 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:42.182492 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.182467 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f"
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.183691 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.183774 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b"
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.183793 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.183794 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.183875 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.183990 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.184128 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.183879 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4n2dn\""
Apr 16 17:40:42.185520 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.185500 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.186844 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.186795 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.186934 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.186861 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.187358 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.187115 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vfq9g"
Apr 16 17:40:42.187358 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.187330 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 17:40:42.187358 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.187345 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-k5ft7\""
Apr 16 17:40:42.188876 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.188255 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.188876 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.188454 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 17:40:42.188876 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.188486 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 17:40:42.188876 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.188798 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.189907 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.189888 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.190002 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.189933 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 17:40:42.190002 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.189976 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9zc72\""
Apr 16 17:40:42.190133 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.190117 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 17:40:42.190435 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.190411 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 17:40:42.190511 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.190437 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 17:40:42.190567 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.190522 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xm98m\""
Apr 16 17:40:42.190792 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.190775 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.191417 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.191386 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lwjrt\""
Apr 16 17:40:42.191417 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.191406 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.191783 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.191770 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.192603 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.192603 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192598 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:42.192811 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192629 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qlzd\" (UniqueName: \"kubernetes.io/projected/ead1084e-aa6f-4c13-8538-b128d209d29d-kube-api-access-8qlzd\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.192811 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46fwz\" (UniqueName: \"kubernetes.io/projected/9e05546c-6678-483d-95a0-c4a2873b12f7-kube-api-access-46fwz\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.192811 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192724 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85cg\" (UniqueName: \"kubernetes.io/projected/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-kube-api-access-t85cg\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:42.192811 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-socket-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.192811 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192784 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ead1084e-aa6f-4c13-8538-b128d209d29d-iptables-alerter-script\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.193060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192824 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-registration-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.193060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192856 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-device-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.193060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192884 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-sys-fs\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.193060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:40:42.193060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192943 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ead1084e-aa6f-4c13-8538-b128d209d29d-host-slash\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.193060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.192974 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.193411 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.193396 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.193584 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.193572 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.193736 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.193700 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8m9jf\""
Apr 16 17:40:42.196497 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.195116 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.197452 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.197434 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wzns8"
Apr 16 17:40:42.197804 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.197788 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.198155 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.198138 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 17:40:42.198368 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.198354 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.198368 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.198365 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.198490 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.198429 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 17:40:42.198545 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.198490 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dx7fm\""
Apr 16 17:40:42.200122 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.200106 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 17:40:42.200224 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.200120 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dqnd6\""
Apr 16 17:40:42.200224 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.200120 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 17:40:42.200474 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.200458 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 17:40:42.200565 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.200547 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l7cw4\""
Apr 16 17:40:42.200621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.200549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 17:40:42.201273 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.201254 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 17:40:42.229247 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.229220 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:41 +0000 UTC" deadline="2028-01-12 14:27:01.406629568 +0000 UTC"
Apr 16 17:40:42.229247 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.229245 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15260h46m19.177386837s"
Apr 16 17:40:42.284012 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.283990 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 17:40:42.294039 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-socket-dir-parent\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-multus-certs\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294072 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eff4ef24-cd37-4018-b078-2147a941d9e2-agent-certs\") pod \"konnectivity-agent-vfq9g\" (UID: \"eff4ef24-cd37-4018-b078-2147a941d9e2\") " pod="kube-system/konnectivity-agent-vfq9g"
Apr 16 17:40:42.294165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-node-log\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.294165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294122 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-cni-netd\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.294165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-host\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.294165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294159 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-system-cni-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294190 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-socket-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-lib-modules\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-k8s-cni-cncf-io\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eff4ef24-cd37-4018-b078-2147a941d9e2-konnectivity-ca\") pod \"konnectivity-agent-vfq9g\" (UID: \"eff4ef24-cd37-4018-b078-2147a941d9e2\") " pod="kube-system/konnectivity-agent-vfq9g"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294316 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-registration-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-sys-fs\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294370 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-socket-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:40:42.294425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-registration-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ead1084e-aa6f-4c13-8538-b128d209d29d-host-slash\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-cnibin\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-sys-fs\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294518 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ead1084e-aa6f-4c13-8538-b128d209d29d-host-slash\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294542 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-cni-multus\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294590 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-hostroot\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t85cg\" (UniqueName: \"kubernetes.io/projected/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-kube-api-access-t85cg\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294651 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8450dc7-532b-4e18-b522-ff27fa85e7be-tmp-dir\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-run-netns\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-etc-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysconfig\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294768 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-kubelet\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysctl-conf\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-var-lib-kubelet\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.294857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294845 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nx7\" (UniqueName: \"kubernetes.io/projected/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-kube-api-access-v6nx7\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\"
(UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-etc-kubernetes\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294915 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294938 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-system-cni-dir\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294953 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-systemd-units\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-kubernetes\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.295448 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:40:42.294983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-systemd\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.294997 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/012ac08b-278c-44fb-8aac-833db15265e1-serviceca\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-cni-binary-copy\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295026 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295042 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-var-lib-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295099 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46fwz\" (UniqueName: \"kubernetes.io/projected/9e05546c-6678-483d-95a0-c4a2873b12f7-kube-api-access-46fwz\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8450dc7-532b-4e18-b522-ff27fa85e7be-hosts-file\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295171 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmzcs\" (UniqueName: \"kubernetes.io/projected/012ac08b-278c-44fb-8aac-833db15265e1-kube-api-access-hmzcs\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295199 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-log-socket\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.295448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92a42226-59e2-448d-8e37-54365cce5c71-ovn-node-metrics-cert\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295289 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-ovnkube-script-lib\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjg57\" (UniqueName: \"kubernetes.io/projected/c8450dc7-532b-4e18-b522-ff27fa85e7be-kube-api-access-sjg57\") pod 
\"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295381 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-modprobe-d\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295411 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-run\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-cni-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-conf-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295491 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dd28\" (UniqueName: 
\"kubernetes.io/projected/4d55587d-f876-4aba-a477-18275926697a-kube-api-access-8dd28\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295535 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ead1084e-aa6f-4c13-8538-b128d209d29d-iptables-alerter-script\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/012ac08b-278c-44fb-8aac-833db15265e1-host\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295594 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-sys\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295719 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-tuned\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qlzd\" (UniqueName: \"kubernetes.io/projected/ead1084e-aa6f-4c13-8538-b128d209d29d-kube-api-access-8qlzd\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-cnibin\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295910 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysctl-d\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.296189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-os-release\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 
17:40:42.295974 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-systemd\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295994 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-cni-bin\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.295994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296022 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmzg\" (UniqueName: \"kubernetes.io/projected/b67eedf2-6206-4ed3-9f3d-437023b25e92-kube-api-access-4xmzg\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296047 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-ovnkube-config\") pod \"ovnkube-node-kd892\" (UID: 
\"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-env-overrides\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296076 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrdn8\" (UniqueName: \"kubernetes.io/projected/92a42226-59e2-448d-8e37-54365cce5c71-kube-api-access-hrdn8\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-tmp\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d55587d-f876-4aba-a477-18275926697a-cni-binary-copy\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296123 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/ead1084e-aa6f-4c13-8538-b128d209d29d-iptables-alerter-script\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-netns\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296147 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-cni-bin\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-device-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-ovn\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296224 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-run-ovn-kubernetes\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296261 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d55587d-f876-4aba-a477-18275926697a-multus-daemon-config\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.296803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9e05546c-6678-483d-95a0-c4a2873b12f7-device-dir\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" Apr 16 17:40:42.297322 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296287 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:42.297322 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-kubelet\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 
17:40:42.297322 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-slash\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.297322 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.296344 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-os-release\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.297322 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.296396 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:42.297322 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.296467 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" failed. No retries permitted until 2026-04-16 17:40:42.796436929 +0000 UTC m=+3.047851740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:42.311901 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.311871 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 17:40:42.314483 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.314462 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:42.314784 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.314488 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:42.314784 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.314503 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:42.314784 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.314774 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. No retries permitted until 2026-04-16 17:40:42.814754374 +0000 UTC m=+3.066169194 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:42.316436 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.316403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85cg\" (UniqueName: \"kubernetes.io/projected/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-kube-api-access-t85cg\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:42.316525 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.316421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46fwz\" (UniqueName: \"kubernetes.io/projected/9e05546c-6678-483d-95a0-c4a2873b12f7-kube-api-access-46fwz\") pod \"aws-ebs-csi-driver-node-b8nqj\" (UID: \"9e05546c-6678-483d-95a0-c4a2873b12f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj"
Apr 16 17:40:42.316906 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.316885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qlzd\" (UniqueName: \"kubernetes.io/projected/ead1084e-aa6f-4c13-8538-b128d209d29d-kube-api-access-8qlzd\") pod \"iptables-alerter-s27g6\" (UID: \"ead1084e-aa6f-4c13-8538-b128d209d29d\") " pod="openshift-network-operator/iptables-alerter-s27g6"
Apr 16 17:40:42.397556 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-cni-multus\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.397556 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397556 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-hostroot\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397585 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8450dc7-532b-4e18-b522-ff27fa85e7be-tmp-dir\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-run-netns\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-cni-multus\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397632 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-etc-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysconfig\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-hostroot\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-kubelet\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.397793 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-run-netns\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397794 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-etc-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysconfig\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-kubelet\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysctl-conf\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-var-lib-kubelet\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.397973 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-var-lib-kubelet\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398002 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nx7\" (UniqueName: \"kubernetes.io/projected/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-kube-api-access-v6nx7\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8450dc7-532b-4e18-b522-ff27fa85e7be-tmp-dir\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398044 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysctl-conf\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-etc-kubernetes\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-system-cni-dir\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-systemd-units\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.398162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-kubernetes\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-etc-kubernetes\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398240 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-systemd\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-systemd\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398272 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/012ac08b-278c-44fb-8aac-833db15265e1-serviceca\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-cni-binary-copy\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398309 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-var-lib-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-systemd-units\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8450dc7-532b-4e18-b522-ff27fa85e7be-hosts-file\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398404 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8450dc7-532b-4e18-b522-ff27fa85e7be-hosts-file\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398407 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-kubernetes\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmzcs\" (UniqueName: \"kubernetes.io/projected/012ac08b-278c-44fb-8aac-833db15265e1-kube-api-access-hmzcs\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-log-socket\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398460 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-system-cni-dir\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399074 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398515 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92a42226-59e2-448d-8e37-54365cce5c71-ovn-node-metrics-cert\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-ovnkube-script-lib\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjg57\" (UniqueName: \"kubernetes.io/projected/c8450dc7-532b-4e18-b522-ff27fa85e7be-kube-api-access-sjg57\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398599 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-modprobe-d\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-run\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398689 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-cni-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398754 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-conf-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/012ac08b-278c-44fb-8aac-833db15265e1-serviceca\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398844 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.398857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-conf-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-log-socket\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-run\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399083 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399197 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-modprobe-d\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399227 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dd28\" (UniqueName: \"kubernetes.io/projected/4d55587d-f876-4aba-a477-18275926697a-kube-api-access-8dd28\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399257 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/012ac08b-278c-44fb-8aac-833db15265e1-host\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-sys\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.399846 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399291 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-tuned\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399274 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-var-lib-openvswitch\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-cnibin\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399418 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-cnibin\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/012ac08b-278c-44fb-8aac-833db15265e1-host\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysctl-d\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-os-release\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-cni-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399680 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-ovnkube-script-lib\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399690 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-sys\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-os-release\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.399974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-cni-binary-copy\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-sysctl-d\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-systemd\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400315 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-cni-bin\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmzg\" (UniqueName: \"kubernetes.io/projected/b67eedf2-6206-4ed3-9f3d-437023b25e92-kube-api-access-4xmzg\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-ovnkube-config\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.400475 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400391 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-systemd\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.401272 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-env-overrides\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.401272 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrdn8\" (UniqueName: \"kubernetes.io/projected/92a42226-59e2-448d-8e37-54365cce5c71-kube-api-access-hrdn8\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.401272 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-tmp\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.402070 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.402036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-env-overrides\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.402439 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.402408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92a42226-59e2-448d-8e37-54365cce5c71-ovnkube-config\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.402971 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.400432 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-cni-bin\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403177 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-etc-tuned\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt"
Apr 16 17:40:42.403246 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403222 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92a42226-59e2-448d-8e37-54365cce5c71-ovn-node-metrics-cert\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403246 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403217 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d55587d-f876-4aba-a477-18275926697a-cni-binary-copy\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403359 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-netns\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403413 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403366 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-netns\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403413 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-cni-bin\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403522 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-ovn\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403522 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403474 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-run-ovn-kubernetes\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403533 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-run-ovn-kubernetes\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403543 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-run-ovn\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403588 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-var-lib-cni-bin\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403617 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4d55587d-f876-4aba-a477-18275926697a-multus-daemon-config\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403669 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-kubelet\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-slash\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892"
Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-os-release\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403788 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-socket-dir-parent\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403806 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4d55587d-f876-4aba-a477-18275926697a-cni-binary-copy\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp"
Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName:
\"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-multus-certs\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.403882 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403880 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-slash\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403881 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eff4ef24-cd37-4018-b078-2147a941d9e2-agent-certs\") pod \"konnectivity-agent-vfq9g\" (UID: \"eff4ef24-cd37-4018-b078-2147a941d9e2\") " pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403929 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-node-log\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-cni-netd\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.403969 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-multus-socket-dir-parent\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-host\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-system-cni-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404044 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-os-release\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-lib-modules\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4d55587d-f876-4aba-a477-18275926697a-multus-daemon-config\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-multus-certs\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-kubelet\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-k8s-cni-cncf-io\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-system-cni-dir\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404215 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-host-cni-netd\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eff4ef24-cd37-4018-b078-2147a941d9e2-konnectivity-ca\") pod \"konnectivity-agent-vfq9g\" (UID: \"eff4ef24-cd37-4018-b078-2147a941d9e2\") " pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404272 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4d55587d-f876-4aba-a477-18275926697a-host-run-k8s-cni-cncf-io\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404571 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-lib-modules\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-tmp\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b67eedf2-6206-4ed3-9f3d-437023b25e92-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.404864 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/eff4ef24-cd37-4018-b078-2147a941d9e2-konnectivity-ca\") pod \"konnectivity-agent-vfq9g\" (UID: \"eff4ef24-cd37-4018-b078-2147a941d9e2\") " pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:40:42.405435 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-host\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.405435 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92a42226-59e2-448d-8e37-54365cce5c71-node-log\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.405435 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.404969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-cnibin\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.405435 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.405014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b67eedf2-6206-4ed3-9f3d-437023b25e92-cnibin\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.407059 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.407039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/eff4ef24-cd37-4018-b078-2147a941d9e2-agent-certs\") pod \"konnectivity-agent-vfq9g\" (UID: \"eff4ef24-cd37-4018-b078-2147a941d9e2\") " pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:40:42.412483 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.412458 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjg57\" (UniqueName: \"kubernetes.io/projected/c8450dc7-532b-4e18-b522-ff27fa85e7be-kube-api-access-sjg57\") pod \"node-resolver-glb6b\" (UID: \"c8450dc7-532b-4e18-b522-ff27fa85e7be\") " pod="openshift-dns/node-resolver-glb6b" Apr 16 17:40:42.412749 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.412728 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nx7\" (UniqueName: \"kubernetes.io/projected/3ec33ee3-a36f-4904-83ec-95e4c57a5a44-kube-api-access-v6nx7\") pod \"tuned-4nrxt\" (UID: \"3ec33ee3-a36f-4904-83ec-95e4c57a5a44\") " pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.413030 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.413005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8dd28\" (UniqueName: \"kubernetes.io/projected/4d55587d-f876-4aba-a477-18275926697a-kube-api-access-8dd28\") pod \"multus-5z7gp\" (UID: \"4d55587d-f876-4aba-a477-18275926697a\") " pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.413946 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.413925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmzg\" (UniqueName: \"kubernetes.io/projected/b67eedf2-6206-4ed3-9f3d-437023b25e92-kube-api-access-4xmzg\") pod \"multus-additional-cni-plugins-ckg4v\" (UID: \"b67eedf2-6206-4ed3-9f3d-437023b25e92\") " pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.414112 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.414096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmzcs\" (UniqueName: \"kubernetes.io/projected/012ac08b-278c-44fb-8aac-833db15265e1-kube-api-access-hmzcs\") pod \"node-ca-wzns8\" (UID: \"012ac08b-278c-44fb-8aac-833db15265e1\") " pod="openshift-image-registry/node-ca-wzns8" Apr 16 17:40:42.414845 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.414825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrdn8\" (UniqueName: \"kubernetes.io/projected/92a42226-59e2-448d-8e37-54365cce5c71-kube-api-access-hrdn8\") pod \"ovnkube-node-kd892\" (UID: \"92a42226-59e2-448d-8e37-54365cce5c71\") " pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.493985 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.493893 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" Apr 16 17:40:42.502700 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.502675 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s27g6" Apr 16 17:40:42.511559 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.511537 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:40:42.515403 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.515378 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:42.516441 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.516423 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:40:42.523460 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.523441 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" Apr 16 17:40:42.529050 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.529035 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-glb6b" Apr 16 17:40:42.536520 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.536502 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5z7gp" Apr 16 17:40:42.540687 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.540671 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:42.541686 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.541670 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wzns8" Apr 16 17:40:42.547381 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.547366 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" Apr 16 17:40:42.808178 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.808144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:42.808359 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.808302 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:42.808423 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.808371 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" failed. No retries permitted until 2026-04-16 17:40:43.808353058 +0000 UTC m=+4.059767866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:42.909362 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:42.909328 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:42.909538 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.909490 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:42.909538 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.909507 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:42.909538 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.909516 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:42.909646 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:42.909587 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:43.909574257 +0000 UTC m=+4.160989062 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:42.924972 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:42.924940 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e05546c_6678_483d_95a0_c4a2873b12f7.slice/crio-9a18fef825663dc85f7c7b25e66dbe6cba3284be4321de3e2a299ee388b5e821 WatchSource:0}: Error finding container 9a18fef825663dc85f7c7b25e66dbe6cba3284be4321de3e2a299ee388b5e821: Status 404 returned error can't find the container with id 9a18fef825663dc85f7c7b25e66dbe6cba3284be4321de3e2a299ee388b5e821 Apr 16 17:40:42.926576 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:42.926337 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod012ac08b_278c_44fb_8aac_833db15265e1.slice/crio-13c918626151a1e8b1f3a231e905721c6c0f298b993340047e171342312d1783 WatchSource:0}: Error finding container 13c918626151a1e8b1f3a231e905721c6c0f298b993340047e171342312d1783: Status 404 returned error can't find the container with id 13c918626151a1e8b1f3a231e905721c6c0f298b993340047e171342312d1783 Apr 16 17:40:42.928799 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:42.928770 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8450dc7_532b_4e18_b522_ff27fa85e7be.slice/crio-c1289959f315e8c5c421d0510baaa6d8c9636366dcad595d3193acd7df78b1dd WatchSource:0}: Error finding container 
c1289959f315e8c5c421d0510baaa6d8c9636366dcad595d3193acd7df78b1dd: Status 404 returned error can't find the container with id c1289959f315e8c5c421d0510baaa6d8c9636366dcad595d3193acd7df78b1dd
Apr 16 17:40:42.930875 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:42.930620 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92a42226_59e2_448d_8e37_54365cce5c71.slice/crio-31ce7c7f743ba5df45ba01e18f7652fe0732ab9b5f6106deb24815875aa30d44 WatchSource:0}: Error finding container 31ce7c7f743ba5df45ba01e18f7652fe0732ab9b5f6106deb24815875aa30d44: Status 404 returned error can't find the container with id 31ce7c7f743ba5df45ba01e18f7652fe0732ab9b5f6106deb24815875aa30d44
Apr 16 17:40:42.931732 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:42.931682 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec33ee3_a36f_4904_83ec_95e4c57a5a44.slice/crio-40101fbc467abbcc376ccb03c5db085955ba8054f37c529d1914848bc6eb8015 WatchSource:0}: Error finding container 40101fbc467abbcc376ccb03c5db085955ba8054f37c529d1914848bc6eb8015: Status 404 returned error can't find the container with id 40101fbc467abbcc376ccb03c5db085955ba8054f37c529d1914848bc6eb8015
Apr 16 17:40:42.935477 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:40:42.935452 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d55587d_f876_4aba_a477_18275926697a.slice/crio-28aeb01cc3d91237ad57d08b0c8f1d3f70d1e8f9dc9b0c41acec30da3008b241 WatchSource:0}: Error finding container 28aeb01cc3d91237ad57d08b0c8f1d3f70d1e8f9dc9b0c41acec30da3008b241: Status 404 returned error can't find the container with id 28aeb01cc3d91237ad57d08b0c8f1d3f70d1e8f9dc9b0c41acec30da3008b241
Apr 16 17:40:43.229773 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.229499 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:41 +0000 UTC" deadline="2027-12-24 11:43:19.830392598 +0000 UTC"
Apr 16 17:40:43.229773 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.229687 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14802h2m36.600709909s"
Apr 16 17:40:43.292720 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.292663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" event={"ID":"3ec33ee3-a36f-4904-83ec-95e4c57a5a44","Type":"ContainerStarted","Data":"40101fbc467abbcc376ccb03c5db085955ba8054f37c529d1914848bc6eb8015"}
Apr 16 17:40:43.294620 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.294262 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerStarted","Data":"63381cad55fd5ced18bc997ce5c160ddb2d763a57c240fb8daada5ac0cd25fea"}
Apr 16 17:40:43.295554 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.295526 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vfq9g" event={"ID":"eff4ef24-cd37-4018-b078-2147a941d9e2","Type":"ContainerStarted","Data":"78da6ea302eed8bfe351aa96c77a509a18f488018938f152a5988300f7e2eeb2"}
Apr 16 17:40:43.296889 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.296862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"31ce7c7f743ba5df45ba01e18f7652fe0732ab9b5f6106deb24815875aa30d44"}
Apr 16 17:40:43.298036 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.298013 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s27g6" event={"ID":"ead1084e-aa6f-4c13-8538-b128d209d29d","Type":"ContainerStarted","Data":"e657d8e2917742a2f1c623c09bbfba440af6992465b087222b627bd19518d890"}
Apr 16 17:40:43.299228 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.299195 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-glb6b" event={"ID":"c8450dc7-532b-4e18-b522-ff27fa85e7be","Type":"ContainerStarted","Data":"c1289959f315e8c5c421d0510baaa6d8c9636366dcad595d3193acd7df78b1dd"}
Apr 16 17:40:43.300786 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.300760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wzns8" event={"ID":"012ac08b-278c-44fb-8aac-833db15265e1","Type":"ContainerStarted","Data":"13c918626151a1e8b1f3a231e905721c6c0f298b993340047e171342312d1783"}
Apr 16 17:40:43.303978 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.303955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" event={"ID":"9e05546c-6678-483d-95a0-c4a2873b12f7","Type":"ContainerStarted","Data":"9a18fef825663dc85f7c7b25e66dbe6cba3284be4321de3e2a299ee388b5e821"}
Apr 16 17:40:43.306816 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.306089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" event={"ID":"69c8b1379aebe31aba3bde3fe6d1e4ea","Type":"ContainerStarted","Data":"6caaaaa58e8de9d6dce1bb2b618a5682114f9e9750ee5f6d6de921218de1dc48"}
Apr 16 17:40:43.309428 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.309400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5z7gp" event={"ID":"4d55587d-f876-4aba-a477-18275926697a","Type":"ContainerStarted","Data":"28aeb01cc3d91237ad57d08b0c8f1d3f70d1e8f9dc9b0c41acec30da3008b241"}
Apr 16 17:40:43.818019 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.817973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:43.818219 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:43.818201 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:43.818289 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:43.818278 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" failed. No retries permitted until 2026-04-16 17:40:45.818258059 +0000 UTC m=+6.069672886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:43.919759 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:43.919046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:40:43.919759 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:43.919256 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:43.919759 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:43.919275 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:43.919759 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:43.919287 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:43.919759 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:43.919345 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. No retries permitted until 2026-04-16 17:40:45.919326345 +0000 UTC m=+6.170741154 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:44.285738 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:44.285235 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:40:44.285738 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:44.285370 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f"
Apr 16 17:40:44.286254 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:44.285806 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:40:44.286254 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:44.285904 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b"
Apr 16 17:40:44.324179 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:44.324139 2579 generic.go:358] "Generic (PLEG): container finished" podID="28c03fc8170b3947ca7170efad520626" containerID="966a00fce142f920d7c586425a00658fb6cfe8c5ee375a2e0fce2fe165a4a84e" exitCode=0
Apr 16 17:40:44.325112 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:44.325049 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" event={"ID":"28c03fc8170b3947ca7170efad520626","Type":"ContainerDied","Data":"966a00fce142f920d7c586425a00658fb6cfe8c5ee375a2e0fce2fe165a4a84e"}
Apr 16 17:40:44.349569 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:44.349510 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-244.ec2.internal" podStartSLOduration=3.349493157 podStartE2EDuration="3.349493157s" podCreationTimestamp="2026-04-16 17:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:43.322803087 +0000 UTC m=+3.574217914"
watchObservedRunningTime="2026-04-16 17:40:44.349493157 +0000 UTC m=+4.600907985" Apr 16 17:40:45.355815 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:45.355768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" event={"ID":"28c03fc8170b3947ca7170efad520626","Type":"ContainerStarted","Data":"db4e2098d01983f1f39cf5ea8d01ac857c70e694e28ce219c1aac19a47e28b94"} Apr 16 17:40:45.374351 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:45.374264 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-244.ec2.internal" podStartSLOduration=4.374245101 podStartE2EDuration="4.374245101s" podCreationTimestamp="2026-04-16 17:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:45.373729633 +0000 UTC m=+5.625144455" watchObservedRunningTime="2026-04-16 17:40:45.374245101 +0000 UTC m=+5.625659930" Apr 16 17:40:45.834974 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:45.834932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:45.835144 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:45.835095 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:45.835223 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:45.835158 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" 
failed. No retries permitted until 2026-04-16 17:40:49.835137944 +0000 UTC m=+10.086552762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:45.936033 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:45.935995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:45.936199 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:45.936158 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:45.936199 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:45.936182 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:45.936199 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:45.936195 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:45.936375 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:45.936254 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr 
podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. No retries permitted until 2026-04-16 17:40:49.936233818 +0000 UTC m=+10.187648640 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:46.285124 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:46.284835 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:46.285124 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:46.284871 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:46.285124 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:46.284960 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:46.285124 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:46.285086 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:40:48.285867 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:48.285834 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:48.286258 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:48.285970 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:40:48.286368 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:48.286289 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:48.286368 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:48.286346 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:49.872917 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:49.872883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:49.873377 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:49.873017 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:49.873377 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:49.873076 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" failed. No retries permitted until 2026-04-16 17:40:57.873054954 +0000 UTC m=+18.124469763 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:49.973280 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:49.973210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:49.973459 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:49.973349 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:49.973459 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:49.973371 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:49.973459 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:49.973385 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:49.973459 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:49.973440 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:57.973421257 +0000 UTC m=+18.224836067 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:50.286538 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:50.285960 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:50.286538 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:50.286070 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:50.286538 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:50.286429 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:50.286538 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:50.286502 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:40:52.283510 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:52.283480 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:52.283920 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:52.283605 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:52.283920 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:52.283648 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:52.283920 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:52.283782 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:40:54.283378 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:54.283341 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:54.283872 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:54.283380 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:54.283872 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:54.283469 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:40:54.283872 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:54.283598 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:56.282642 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:56.282603 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:56.283066 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:56.282607 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:56.283066 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:56.282758 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:40:56.283066 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:56.282846 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:57.931141 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:57.931106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:57.931660 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:57.931271 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:57.931660 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:57.931345 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" failed. No retries permitted until 2026-04-16 17:41:13.931326162 +0000 UTC m=+34.182740978 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:58.032080 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:58.032039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:58.032261 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:58.032198 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:58.032261 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:58.032220 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:58.032261 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:58.032234 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:58.032396 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:58.032290 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:14.032272775 +0000 UTC m=+34.283687604 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:58.283129 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:58.283088 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:40:58.283315 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:40:58.283087 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:40:58.283315 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:58.283259 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:40:58.283428 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:40:58.283322 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:00.283608 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:00.283574 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:00.283974 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:00.283650 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:00.283974 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:00.283750 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:00.283974 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:00.283842 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:01.383533 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.383054 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5z7gp" event={"ID":"4d55587d-f876-4aba-a477-18275926697a","Type":"ContainerStarted","Data":"806bf39bc9cd03b4741ef7e176db7c0ee83e71c708375b9a31ca3e560c489932"} Apr 16 17:41:01.384563 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.384535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" event={"ID":"3ec33ee3-a36f-4904-83ec-95e4c57a5a44","Type":"ContainerStarted","Data":"5b0dd5114e0fd53c98c610ab1b0908c121de2be47a7089ae66773a57d0b94ab0"} Apr 16 17:41:01.386045 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.386018 2579 generic.go:358] "Generic (PLEG): container finished" podID="b67eedf2-6206-4ed3-9f3d-437023b25e92" containerID="d8353d409436fa66a76c194aaaeeba806db401f4ff0d5fcf5a997fe0fde7665d" exitCode=0 Apr 16 17:41:01.386165 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.386083 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerDied","Data":"d8353d409436fa66a76c194aaaeeba806db401f4ff0d5fcf5a997fe0fde7665d"} Apr 16 17:41:01.388658 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.388221 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vfq9g" event={"ID":"eff4ef24-cd37-4018-b078-2147a941d9e2","Type":"ContainerStarted","Data":"3e8ae423a88524dd1097395bd4cf3df81eee4b9ecec4726304186604e9b720b7"} Apr 16 17:41:01.393899 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.393867 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log" Apr 16 17:41:01.394254 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.394232 2579 generic.go:358] "Generic (PLEG): container finished" podID="92a42226-59e2-448d-8e37-54365cce5c71" containerID="7870c0f77bbedf27cce8023fda190ba860419843c9f43eba94208372332c5baa" exitCode=1 Apr 16 17:41:01.394337 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.394295 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"72bbe38184c22a859cd559470ca24a64c0ae1a4ed178d73621c20a9506d078ef"} Apr 16 17:41:01.394337 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.394328 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"a9556d8675d600392b12ab6cdcdef81d0ebb9fe8d5a7dc8e5016efcafd6198cd"} Apr 16 17:41:01.394447 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.394345 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"7a13d713c084f77f31beb7bf29b955186975ecd0ff9bc0d265d06fc626f071c0"} Apr 16 17:41:01.394447 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.394358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"5a7f4618780a464302ceae87b9734ac2f344789fffc587afa83893159ee876d4"} Apr 16 17:41:01.394447 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.394373 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerDied","Data":"7870c0f77bbedf27cce8023fda190ba860419843c9f43eba94208372332c5baa"} Apr 16 17:41:01.394447 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:41:01.394387 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"4cd4705d0287e338f848bade4b2166cdc0ef62c4ca379f620d546b9a70feac63"} Apr 16 17:41:01.395622 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.395592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-glb6b" event={"ID":"c8450dc7-532b-4e18-b522-ff27fa85e7be","Type":"ContainerStarted","Data":"ac8c520d63dafc27635c729c9b5d9e85b8ec3c57000dea7d57e51196c013b6b1"} Apr 16 17:41:01.397163 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.397142 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wzns8" event={"ID":"012ac08b-278c-44fb-8aac-833db15265e1","Type":"ContainerStarted","Data":"a934fd1d9ed70efb905b5b9eacaaa004c1c78fceb6703bb65a244c1425182b37"} Apr 16 17:41:01.398443 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.398421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" event={"ID":"9e05546c-6678-483d-95a0-c4a2873b12f7","Type":"ContainerStarted","Data":"c11bb0e6b391cbaf89c5ea5a139e2f42953e36b7e39359b77f955f29e24886b4"} Apr 16 17:41:01.408859 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.408824 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5z7gp" podStartSLOduration=3.767469011 podStartE2EDuration="21.408797669s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.937763779 +0000 UTC m=+3.189178596" lastFinishedPulling="2026-04-16 17:41:00.57909244 +0000 UTC m=+20.830507254" observedRunningTime="2026-04-16 17:41:01.408308435 +0000 UTC m=+21.659723260" watchObservedRunningTime="2026-04-16 17:41:01.408797669 +0000 UTC m=+21.660212495" Apr 16 17:41:01.460284 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:41:01.460233 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vfq9g" podStartSLOduration=12.137075019 podStartE2EDuration="21.460217445s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.939853853 +0000 UTC m=+3.191268664" lastFinishedPulling="2026-04-16 17:40:52.262996283 +0000 UTC m=+12.514411090" observedRunningTime="2026-04-16 17:41:01.459981385 +0000 UTC m=+21.711396211" watchObservedRunningTime="2026-04-16 17:41:01.460217445 +0000 UTC m=+21.711632272" Apr 16 17:41:01.478838 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.478784 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4nrxt" podStartSLOduration=3.835156433 podStartE2EDuration="21.478773473s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.934902249 +0000 UTC m=+3.186317052" lastFinishedPulling="2026-04-16 17:41:00.578519285 +0000 UTC m=+20.829934092" observedRunningTime="2026-04-16 17:41:01.477988225 +0000 UTC m=+21.729403052" watchObservedRunningTime="2026-04-16 17:41:01.478773473 +0000 UTC m=+21.730188299" Apr 16 17:41:01.510495 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.510456 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-glb6b" podStartSLOduration=3.890240604 podStartE2EDuration="21.510442392s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.931758331 +0000 UTC m=+3.183173152" lastFinishedPulling="2026-04-16 17:41:00.551960123 +0000 UTC m=+20.803374940" observedRunningTime="2026-04-16 17:41:01.494782649 +0000 UTC m=+21.746197474" watchObservedRunningTime="2026-04-16 17:41:01.510442392 +0000 UTC m=+21.761857215" Apr 16 17:41:01.510622 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.510540 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wzns8" podStartSLOduration=4.049119762 podStartE2EDuration="21.510536917s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.929397377 +0000 UTC m=+3.180812182" lastFinishedPulling="2026-04-16 17:41:00.390814533 +0000 UTC m=+20.642229337" observedRunningTime="2026-04-16 17:41:01.510384854 +0000 UTC m=+21.761799681" watchObservedRunningTime="2026-04-16 17:41:01.510536917 +0000 UTC m=+21.761951745" Apr 16 17:41:01.952675 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:01.952647 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 17:41:02.255402 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.255227 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:41:01.952668298Z","UUID":"606d3a20-9117-4444-b828-94cd261ffd5a","Handler":null,"Name":"","Endpoint":""} Apr 16 17:41:02.258407 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.258375 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 17:41:02.258538 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.258415 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 17:41:02.282595 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.282571 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:02.282786 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.282571 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:02.282786 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:02.282676 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:02.282786 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:02.282764 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:02.402258 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.402219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s27g6" event={"ID":"ead1084e-aa6f-4c13-8538-b128d209d29d","Type":"ContainerStarted","Data":"e6af0c7fb06bb382610d4858e6cc1a2350f9625cb5623dd839b16405fe793986"} Apr 16 17:41:02.404491 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.404458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" event={"ID":"9e05546c-6678-483d-95a0-c4a2873b12f7","Type":"ContainerStarted","Data":"dc19cdbea8e6cd8478fb668f1a2276e3fa4399c507af93edc53893b822558b3a"} Apr 16 17:41:02.419258 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:02.419204 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s27g6" podStartSLOduration=4.968412345 podStartE2EDuration="22.419189676s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.939953911 +0000 UTC m=+3.191368715" lastFinishedPulling="2026-04-16 17:41:00.390731242 +0000 UTC m=+20.642146046" observedRunningTime="2026-04-16 17:41:02.418723892 +0000 UTC m=+22.670138710" watchObservedRunningTime="2026-04-16 17:41:02.419189676 +0000 UTC m=+22.670604502" Apr 16 17:41:03.006807 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:03.006775 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:41:03.007470 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:03.007448 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:41:03.409056 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:03.409020 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" event={"ID":"9e05546c-6678-483d-95a0-c4a2873b12f7","Type":"ContainerStarted","Data":"9ecc4886713a41e12760f8e867ab312d4cf4c71da65095fcfbb0afa28783d762"} Apr 16 17:41:03.430438 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:03.430394 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b8nqj" podStartSLOduration=3.207999562 podStartE2EDuration="23.430379428s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.927673939 +0000 UTC m=+3.179088759" lastFinishedPulling="2026-04-16 17:41:03.150053817 +0000 UTC m=+23.401468625" observedRunningTime="2026-04-16 17:41:03.429988679 +0000 UTC m=+23.681403506" watchObservedRunningTime="2026-04-16 17:41:03.430379428 +0000 UTC m=+23.681794254" Apr 16 17:41:04.283299 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.283263 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:04.283483 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:04.283386 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:04.283483 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.283446 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:04.283582 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:04.283537 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:04.414021 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.413993 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log" Apr 16 17:41:04.414420 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.414382 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"05623ba4a26aa2e502d28053f0e0c81cf731cec2f65c105aacbbbeb94a4ead4e"} Apr 16 17:41:04.414420 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.414393 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 17:41:04.820532 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.820313 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:41:04.820932 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:04.820904 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vfq9g" Apr 16 17:41:06.282667 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.282627 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:06.283589 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.282627 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:06.283589 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:06.282776 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:06.283589 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:06.282850 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:06.421386 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.421224 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log" Apr 16 17:41:06.421779 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.421754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"60b1b46412d9eb169e5c7081cf779a9ea0503c1ab1903357e757df0036ee3bf4"} Apr 16 17:41:06.422062 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.422037 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:41:06.422062 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.422064 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:41:06.422254 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.422170 2579 scope.go:117] "RemoveContainer" containerID="7870c0f77bbedf27cce8023fda190ba860419843c9f43eba94208372332c5baa" Apr 16 17:41:06.423511 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.423441 2579 generic.go:358] "Generic (PLEG): container finished" podID="b67eedf2-6206-4ed3-9f3d-437023b25e92" containerID="75e51047cf0f363a2a036caffb5fcce414c2223171ef23a68af48ef62bce5424" exitCode=0 Apr 16 17:41:06.423615 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.423567 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerDied","Data":"75e51047cf0f363a2a036caffb5fcce414c2223171ef23a68af48ef62bce5424"} Apr 16 17:41:06.439342 ip-10-0-133-244 kubenswrapper[2579]: 
I0416 17:41:06.439314 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:41:06.439443 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:06.439416 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:41:07.427310 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:07.427222 2579 generic.go:358] "Generic (PLEG): container finished" podID="b67eedf2-6206-4ed3-9f3d-437023b25e92" containerID="99b5187c91ae5ee4bfd377f712943e3eb01b8e472fd85fe2aab26fce06d05256" exitCode=0 Apr 16 17:41:07.427310 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:07.427296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerDied","Data":"99b5187c91ae5ee4bfd377f712943e3eb01b8e472fd85fe2aab26fce06d05256"} Apr 16 17:41:07.430463 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:07.430448 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log" Apr 16 17:41:07.430807 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:07.430783 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" event={"ID":"92a42226-59e2-448d-8e37-54365cce5c71","Type":"ContainerStarted","Data":"df79c95b07381a2a346a9a52e484e165b63fcc1af369cdd5142d70a45d7a24c0"} Apr 16 17:41:07.430993 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:07.430983 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:41:07.483852 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:07.483805 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" 
podStartSLOduration=9.77370945 podStartE2EDuration="27.483791039s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.932943626 +0000 UTC m=+3.184358433" lastFinishedPulling="2026-04-16 17:41:00.643025218 +0000 UTC m=+20.894440022" observedRunningTime="2026-04-16 17:41:07.483417835 +0000 UTC m=+27.734832661" watchObservedRunningTime="2026-04-16 17:41:07.483791039 +0000 UTC m=+27.735205899" Apr 16 17:41:08.283451 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:08.283422 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:08.283451 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:08.283448 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:08.283644 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:08.283526 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:08.283644 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:08.283633 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:08.434692 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:08.434658 2579 generic.go:358] "Generic (PLEG): container finished" podID="b67eedf2-6206-4ed3-9f3d-437023b25e92" containerID="3bfd12193a260b6b99b8ec10a1604fd97a1cb17e543d875089ea996d1be3ea4c" exitCode=0 Apr 16 17:41:08.435104 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:08.434750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerDied","Data":"3bfd12193a260b6b99b8ec10a1604fd97a1cb17e543d875089ea996d1be3ea4c"} Apr 16 17:41:10.283585 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:10.283555 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:10.284028 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:10.283663 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:10.284028 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:10.283905 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:10.284129 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:10.284038 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:12.283072 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:12.283037 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:12.283497 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:12.283208 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:12.283497 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:12.283255 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:12.283497 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:12.283370 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:13.945854 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:13.945816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:13.946408 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:13.946016 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:13.946408 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:13.946101 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs podName:c8eeb2ef-1018-4541-8fb1-d6d55ca7680f nodeName:}" failed. No retries permitted until 2026-04-16 17:41:45.946077607 +0000 UTC m=+66.197492424 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs") pod "network-metrics-daemon-8zmrl" (UID: "c8eeb2ef-1018-4541-8fb1-d6d55ca7680f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:41:14.046968 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.046937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:14.047149 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.047115 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:41:14.047149 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.047139 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:41:14.047259 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.047153 2579 projected.go:194] Error preparing data for projected volume kube-api-access-gvclr for pod openshift-network-diagnostics/network-check-target-792zb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:14.047259 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.047246 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr podName:d4e56557-2a7a-4842-bf26-96d804cdf01b nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:46.047226599 +0000 UTC m=+66.298641406 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-gvclr" (UniqueName: "kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr") pod "network-check-target-792zb" (UID: "d4e56557-2a7a-4842-bf26-96d804cdf01b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:41:14.283535 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.283512 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:14.283671 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.283641 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:14.283765 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.283693 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:14.283820 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.283785 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:14.449376 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.449199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerStarted","Data":"91f99a318f142f4fec393d5cf0a2059d4c8c2e5e55cb619a36fda8a164d6a371"} Apr 16 17:41:14.528065 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.527992 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-792zb"] Apr 16 17:41:14.528223 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.528091 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:14.528223 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.528181 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b" Apr 16 17:41:14.530621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.530597 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8zmrl"] Apr 16 17:41:14.530788 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:14.530677 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:14.530788 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:14.530780 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f" Apr 16 17:41:15.453519 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:15.453419 2579 generic.go:358] "Generic (PLEG): container finished" podID="b67eedf2-6206-4ed3-9f3d-437023b25e92" containerID="91f99a318f142f4fec393d5cf0a2059d4c8c2e5e55cb619a36fda8a164d6a371" exitCode=0 Apr 16 17:41:15.453894 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:15.453501 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerDied","Data":"91f99a318f142f4fec393d5cf0a2059d4c8c2e5e55cb619a36fda8a164d6a371"} Apr 16 17:41:16.283189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:16.283155 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb" Apr 16 17:41:16.283366 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:16.283159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:16.283366 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:16.283255 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b"
Apr 16 17:41:16.283366 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:16.283348 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f"
Apr 16 17:41:16.458207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:16.458174 2579 generic.go:358] "Generic (PLEG): container finished" podID="b67eedf2-6206-4ed3-9f3d-437023b25e92" containerID="1ca5c124eef7d2ccc0b9fe32b81ff2ec6e5fffe60b491d86cfde6d8aaf98deb3" exitCode=0
Apr 16 17:41:16.458207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:16.458203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerDied","Data":"1ca5c124eef7d2ccc0b9fe32b81ff2ec6e5fffe60b491d86cfde6d8aaf98deb3"}
Apr 16 17:41:17.465974 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:17.465934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" event={"ID":"b67eedf2-6206-4ed3-9f3d-437023b25e92","Type":"ContainerStarted","Data":"14fe69815bca627d9138d1aded3edc0516b60d110fdcbe0e93f05c9d5dc8bf45"}
Apr 16 17:41:17.497788 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:17.497742 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ckg4v" podStartSLOduration=6.165316307 podStartE2EDuration="37.497728528s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:40:42.941780867 +0000 UTC m=+3.193195676" lastFinishedPulling="2026-04-16 17:41:14.274193082 +0000 UTC m=+34.525607897" observedRunningTime="2026-04-16 17:41:17.495725846 +0000 UTC m=+37.747140663" watchObservedRunningTime="2026-04-16 17:41:17.497728528 +0000 UTC m=+37.749143370"
Apr 16 17:41:18.282892 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.282861 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:41:18.283063 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.282863 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:41:18.283063 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:18.282997 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-792zb" podUID="d4e56557-2a7a-4842-bf26-96d804cdf01b"
Apr 16 17:41:18.283063 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:18.283050 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zmrl" podUID="c8eeb2ef-1018-4541-8fb1-d6d55ca7680f"
Apr 16 17:41:18.540581 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.540504 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-244.ec2.internal" event="NodeReady"
Apr 16 17:41:18.540939 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.540633 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 17:41:18.592725 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.592680 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ck9nk"]
Apr 16 17:41:18.601426 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.601401 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dv4hr"]
Apr 16 17:41:18.601563 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.601546 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.604612 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.604589 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tjhv5\""
Apr 16 17:41:18.605013 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.604984 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 17:41:18.605100 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.605064 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 17:41:18.617429 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.617410 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ck9nk"]
Apr 16 17:41:18.617429 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.617431 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dv4hr"]
Apr 16 17:41:18.617579 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.617441 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d96c2"]
Apr 16 17:41:18.617619 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.617588 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.620559 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.620537 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 17:41:18.620678 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.620610 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 17:41:18.620678 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.620628 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bfzld\""
Apr 16 17:41:18.620678 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.620628 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 17:41:18.620956 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.620942 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 17:41:18.632668 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.632643 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d96c2"]
Apr 16 17:41:18.632785 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.632773 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.635746 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.635724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 17:41:18.635859 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.635747 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 17:41:18.635859 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.635784 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j7r56\""
Apr 16 17:41:18.635945 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.635931 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 17:41:18.683501 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.683473 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e60f91f-0be5-4df7-94f8-f46703367246-config-volume\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.683501 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.683503 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e60f91f-0be5-4df7-94f8-f46703367246-tmp-dir\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.683687 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.683527 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkggg\" (UniqueName: \"kubernetes.io/projected/6e60f91f-0be5-4df7-94f8-f46703367246-kube-api-access-nkggg\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.683687 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.683544 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e60f91f-0be5-4df7-94f8-f46703367246-metrics-tls\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.784837 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.784805 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-data-volume\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.784978 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.784886 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-crio-socket\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.784978 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.784911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e60f91f-0be5-4df7-94f8-f46703367246-config-volume\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.784978 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.784928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e60f91f-0be5-4df7-94f8-f46703367246-tmp-dir\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.784978 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.784950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.785163 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785028 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krlwt\" (UniqueName: \"kubernetes.io/projected/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-kube-api-access-krlwt\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.785163 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkggg\" (UniqueName: \"kubernetes.io/projected/6e60f91f-0be5-4df7-94f8-f46703367246-kube-api-access-nkggg\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.785163 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785104 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e60f91f-0be5-4df7-94f8-f46703367246-metrics-tls\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.785309 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785266 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e60f91f-0be5-4df7-94f8-f46703367246-tmp-dir\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.785309 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.785405 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zm4q\" (UniqueName: \"kubernetes.io/projected/c3ed8245-9e98-4c4e-903f-8729ad167125-kube-api-access-4zm4q\") pod \"ingress-canary-d96c2\" (UID: \"c3ed8245-9e98-4c4e-903f-8729ad167125\") " pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.785405 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3ed8245-9e98-4c4e-903f-8729ad167125-cert\") pod \"ingress-canary-d96c2\" (UID: \"c3ed8245-9e98-4c4e-903f-8729ad167125\") " pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.785589 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.785569 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e60f91f-0be5-4df7-94f8-f46703367246-config-volume\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.789194 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.789171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e60f91f-0be5-4df7-94f8-f46703367246-metrics-tls\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.795400 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.795341 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkggg\" (UniqueName: \"kubernetes.io/projected/6e60f91f-0be5-4df7-94f8-f46703367246-kube-api-access-nkggg\") pod \"dns-default-ck9nk\" (UID: \"6e60f91f-0be5-4df7-94f8-f46703367246\") " pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.886529 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-data-volume\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.886736 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886546 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-crio-socket\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.886736 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.886736 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krlwt\" (UniqueName: \"kubernetes.io/projected/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-kube-api-access-krlwt\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.886736 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-crio-socket\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.886955 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.886955 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zm4q\" (UniqueName: \"kubernetes.io/projected/c3ed8245-9e98-4c4e-903f-8729ad167125-kube-api-access-4zm4q\") pod \"ingress-canary-d96c2\" (UID: \"c3ed8245-9e98-4c4e-903f-8729ad167125\") " pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.886955 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3ed8245-9e98-4c4e-903f-8729ad167125-cert\") pod \"ingress-canary-d96c2\" (UID: \"c3ed8245-9e98-4c4e-903f-8729ad167125\") " pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.886955 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.886932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-data-volume\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.887339 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.887314 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.889047 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.889022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.889238 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.889223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3ed8245-9e98-4c4e-903f-8729ad167125-cert\") pod \"ingress-canary-d96c2\" (UID: \"c3ed8245-9e98-4c4e-903f-8729ad167125\") " pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.903388 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.903289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zm4q\" (UniqueName: \"kubernetes.io/projected/c3ed8245-9e98-4c4e-903f-8729ad167125-kube-api-access-4zm4q\") pod \"ingress-canary-d96c2\" (UID: \"c3ed8245-9e98-4c4e-903f-8729ad167125\") " pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:18.904898 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.904873 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krlwt\" (UniqueName: \"kubernetes.io/projected/c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2-kube-api-access-krlwt\") pod \"insights-runtime-extractor-dv4hr\" (UID: \"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2\") " pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.911828 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.911811 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:18.925822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.925801 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dv4hr"
Apr 16 17:41:18.950443 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:18.950412 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d96c2"
Apr 16 17:41:19.099161 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.099125 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ck9nk"]
Apr 16 17:41:19.103152 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:19.103114 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e60f91f_0be5_4df7_94f8_f46703367246.slice/crio-400a401aba3d1fe1dc872ed4083504d5c407ca04df3cafe6ae9c1602b14679f8 WatchSource:0}: Error finding container 400a401aba3d1fe1dc872ed4083504d5c407ca04df3cafe6ae9c1602b14679f8: Status 404 returned error can't find the container with id 400a401aba3d1fe1dc872ed4083504d5c407ca04df3cafe6ae9c1602b14679f8
Apr 16 17:41:19.112572 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.112544 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dv4hr"]
Apr 16 17:41:19.116437 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:19.116414 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f29aeb_77e1_4b6f_a1d3_b3ab757c6fc2.slice/crio-9bbe7121abb9533a6b4231c82b6c9555bdb59fd40d6915e140c2a082275890ec WatchSource:0}: Error finding container 9bbe7121abb9533a6b4231c82b6c9555bdb59fd40d6915e140c2a082275890ec: Status 404 returned error can't find the container with id 9bbe7121abb9533a6b4231c82b6c9555bdb59fd40d6915e140c2a082275890ec
Apr 16 17:41:19.124808 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.124784 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d96c2"]
Apr 16 17:41:19.126927 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:19.126908 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3ed8245_9e98_4c4e_903f_8729ad167125.slice/crio-9698e6951eadfacbaa8596a07ad15e226eef0dd66b4eda2f07dd71f98a6f79d6 WatchSource:0}: Error finding container 9698e6951eadfacbaa8596a07ad15e226eef0dd66b4eda2f07dd71f98a6f79d6: Status 404 returned error can't find the container with id 9698e6951eadfacbaa8596a07ad15e226eef0dd66b4eda2f07dd71f98a6f79d6
Apr 16 17:41:19.471332 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.471232 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d96c2" event={"ID":"c3ed8245-9e98-4c4e-903f-8729ad167125","Type":"ContainerStarted","Data":"9698e6951eadfacbaa8596a07ad15e226eef0dd66b4eda2f07dd71f98a6f79d6"}
Apr 16 17:41:19.472537 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.472513 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dv4hr" event={"ID":"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2","Type":"ContainerStarted","Data":"7360c60ab7c5e28452deb54195998ba1c8c32dfbd77bc77e0af28092aecc9cc8"}
Apr 16 17:41:19.472672 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.472543 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dv4hr" event={"ID":"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2","Type":"ContainerStarted","Data":"9bbe7121abb9533a6b4231c82b6c9555bdb59fd40d6915e140c2a082275890ec"}
Apr 16 17:41:19.473564 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:19.473540 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ck9nk" event={"ID":"6e60f91f-0be5-4df7-94f8-f46703367246","Type":"ContainerStarted","Data":"400a401aba3d1fe1dc872ed4083504d5c407ca04df3cafe6ae9c1602b14679f8"}
Apr 16 17:41:20.285868 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.285838 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:41:20.286412 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.285838 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:41:20.290214 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.290189 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 17:41:20.290214 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.290205 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wmwn6\""
Apr 16 17:41:20.290394 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.290222 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jmq5l\""
Apr 16 17:41:20.290449 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.290394 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 17:41:20.290494 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.290472 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 17:41:20.896105 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.896074 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-l48nd"]
Apr 16 17:41:20.899347 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.899329 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:20.904064 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.903848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9qf2c\""
Apr 16 17:41:20.904064 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.903860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 17:41:20.904064 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.903891 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 17:41:20.904064 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.903916 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 17:41:20.904064 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.903860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 17:41:20.904064 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.903848 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 17:41:20.908126 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:20.908056 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-l48nd"]
Apr 16 17:41:21.002235 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.002184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.002400 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.002253 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-metrics-client-ca\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.002400 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.002285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.002400 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.002360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl22k\" (UniqueName: \"kubernetes.io/projected/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-kube-api-access-sl22k\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.103615 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.102954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl22k\" (UniqueName: \"kubernetes.io/projected/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-kube-api-access-sl22k\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.103615 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.103035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.103615 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.103087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-metrics-client-ca\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.103615 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.103121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.103615 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:21.103484 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 17:41:21.103615 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:21.103549 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-tls podName:63c8e7ee-6b19-4728-9ac0-160f0b1b974f nodeName:}" failed. No retries permitted until 2026-04-16 17:41:21.603526387 +0000 UTC m=+41.854941195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-tls") pod "prometheus-operator-78f957474d-l48nd" (UID: "63c8e7ee-6b19-4728-9ac0-160f0b1b974f") : secret "prometheus-operator-tls" not found
Apr 16 17:41:21.104788 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.104311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-metrics-client-ca\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.107621 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.107595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.116923 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.114104 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl22k\" (UniqueName: \"kubernetes.io/projected/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-kube-api-access-sl22k\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.481595 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.481488 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dv4hr" event={"ID":"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2","Type":"ContainerStarted","Data":"97d32cc1b4b8dd3d18a2259031aa9e96f22711694ec32cc71f1b40dc10246008"}
Apr 16 17:41:21.483096 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.483069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ck9nk" event={"ID":"6e60f91f-0be5-4df7-94f8-f46703367246","Type":"ContainerStarted","Data":"e1d2a2595aead8e2fbbf0d000e313bbdb3fae5fe0137ee8c2efc6266401c4f86"}
Apr 16 17:41:21.483096 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.483104 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ck9nk" event={"ID":"6e60f91f-0be5-4df7-94f8-f46703367246","Type":"ContainerStarted","Data":"1ecb194ea98c6780100fd5e7065e52ffdd614c4b42464aef8f12ac42bf17d202"}
Apr 16 17:41:21.483258 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.483205 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ck9nk"
Apr 16 17:41:21.484336 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.484307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d96c2" event={"ID":"c3ed8245-9e98-4c4e-903f-8729ad167125","Type":"ContainerStarted","Data":"7294fad767259ac50825e60badf36ab3a33c1f3ac2fad8afb82ce4e7b5e30861"}
Apr 16 17:41:21.504585 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.504541 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ck9nk" podStartSLOduration=1.525859101 podStartE2EDuration="3.504529772s" podCreationTimestamp="2026-04-16 17:41:18 +0000 UTC" firstStartedPulling="2026-04-16 17:41:19.105073848 +0000 UTC m=+39.356488651" lastFinishedPulling="2026-04-16 17:41:21.083744514 +0000 UTC m=+41.335159322" observedRunningTime="2026-04-16 17:41:21.503836639 +0000 UTC m=+41.755251466" watchObservedRunningTime="2026-04-16 17:41:21.504529772 +0000 UTC m=+41.755944597"
Apr 16 17:41:21.522090 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.522035 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d96c2" podStartSLOduration=1.563248775 podStartE2EDuration="3.522018763s" podCreationTimestamp="2026-04-16 17:41:18 +0000 UTC" firstStartedPulling="2026-04-16 17:41:19.128562706 +0000 UTC m=+39.379977511" lastFinishedPulling="2026-04-16 17:41:21.087332695 +0000 UTC m=+41.338747499" observedRunningTime="2026-04-16 17:41:21.521481048 +0000 UTC m=+41.772895891" watchObservedRunningTime="2026-04-16 17:41:21.522018763 +0000 UTC m=+41.773433589"
Apr 16 17:41:21.607680 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.607071 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.610017 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.609989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c8e7ee-6b19-4728-9ac0-160f0b1b974f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-l48nd\" (UID: \"63c8e7ee-6b19-4728-9ac0-160f0b1b974f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd"
Apr 16 17:41:21.810200 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.810167 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd" Apr 16 17:41:21.942131 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:21.942095 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-l48nd"] Apr 16 17:41:21.945128 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:21.945099 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63c8e7ee_6b19_4728_9ac0_160f0b1b974f.slice/crio-5912d6f17defb3d8996be02ef2dbf1f338530ce577cce7ed37e827cede813abb WatchSource:0}: Error finding container 5912d6f17defb3d8996be02ef2dbf1f338530ce577cce7ed37e827cede813abb: Status 404 returned error can't find the container with id 5912d6f17defb3d8996be02ef2dbf1f338530ce577cce7ed37e827cede813abb Apr 16 17:41:22.491129 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:22.491100 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd" event={"ID":"63c8e7ee-6b19-4728-9ac0-160f0b1b974f","Type":"ContainerStarted","Data":"5912d6f17defb3d8996be02ef2dbf1f338530ce577cce7ed37e827cede813abb"} Apr 16 17:41:23.496127 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:23.496086 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd" event={"ID":"63c8e7ee-6b19-4728-9ac0-160f0b1b974f","Type":"ContainerStarted","Data":"d5718ad588d6b54e8854d6de61d69e1ab8a7de57ad231cff86f302893d87279d"} Apr 16 17:41:23.496610 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:23.496135 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd" event={"ID":"63c8e7ee-6b19-4728-9ac0-160f0b1b974f","Type":"ContainerStarted","Data":"66ac92e02a0da48f0b96bd05f72b1502f7435e43a212dff483b4c26c0c26b979"} Apr 16 17:41:23.498528 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:23.498500 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dv4hr" event={"ID":"c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2","Type":"ContainerStarted","Data":"89381412463b7f06714282ca9786170da927e85e706b7a743658076b846f3b7b"} Apr 16 17:41:23.517678 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:23.517633 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-l48nd" podStartSLOduration=2.181833566 podStartE2EDuration="3.517621345s" podCreationTimestamp="2026-04-16 17:41:20 +0000 UTC" firstStartedPulling="2026-04-16 17:41:21.947569221 +0000 UTC m=+42.198984038" lastFinishedPulling="2026-04-16 17:41:23.283356999 +0000 UTC m=+43.534771817" observedRunningTime="2026-04-16 17:41:23.517010479 +0000 UTC m=+43.768425305" watchObservedRunningTime="2026-04-16 17:41:23.517621345 +0000 UTC m=+43.769036187" Apr 16 17:41:23.540163 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:23.540069 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dv4hr" podStartSLOduration=2.312974647 podStartE2EDuration="5.540054802s" podCreationTimestamp="2026-04-16 17:41:18 +0000 UTC" firstStartedPulling="2026-04-16 17:41:19.251973725 +0000 UTC m=+39.503388529" lastFinishedPulling="2026-04-16 17:41:22.479053877 +0000 UTC m=+42.730468684" observedRunningTime="2026-04-16 17:41:23.539722863 +0000 UTC m=+43.791137681" watchObservedRunningTime="2026-04-16 17:41:23.540054802 +0000 UTC m=+43.791469628" Apr 16 17:41:25.356235 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.356203 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dd897"] Apr 16 17:41:25.359818 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.359795 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.363509 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.363481 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 17:41:25.363622 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.363524 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w7bpg\"" Apr 16 17:41:25.363842 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.363825 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 17:41:25.363842 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.363838 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 17:41:25.537341 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537307 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537511 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-tls\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537511 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537373 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6da03822-75a1-421e-be29-b8b96eba8b6b-metrics-client-ca\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537511 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537405 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-sys\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537511 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdf2w\" (UniqueName: \"kubernetes.io/projected/6da03822-75a1-421e-be29-b8b96eba8b6b-kube-api-access-cdf2w\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537751 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-wtmp\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537751 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-root\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537751 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537586 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-textfile\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.537751 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.537616 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-accelerators-collector-config\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638519 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638441 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdf2w\" (UniqueName: \"kubernetes.io/projected/6da03822-75a1-421e-be29-b8b96eba8b6b-kube-api-access-cdf2w\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638652 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-wtmp\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638652 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-root\") pod \"node-exporter-dd897\" 
(UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638652 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-textfile\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638652 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638618 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-accelerators-collector-config\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-root\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-tls\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638741 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-wtmp\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6da03822-75a1-421e-be29-b8b96eba8b6b-metrics-client-ca\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-sys\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.638870 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:25.638848 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 17:41:25.639156 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.638874 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6da03822-75a1-421e-be29-b8b96eba8b6b-sys\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.639156 
ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:25.638907 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-tls podName:6da03822-75a1-421e-be29-b8b96eba8b6b nodeName:}" failed. No retries permitted until 2026-04-16 17:41:26.13888689 +0000 UTC m=+46.390301694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-tls") pod "node-exporter-dd897" (UID: "6da03822-75a1-421e-be29-b8b96eba8b6b") : secret "node-exporter-tls" not found Apr 16 17:41:25.639156 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.639027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-textfile\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.639251 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.639236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-accelerators-collector-config\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.649496 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.649476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.649602 ip-10-0-133-244 kubenswrapper[2579]: 
I0416 17:41:25.649505 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6da03822-75a1-421e-be29-b8b96eba8b6b-metrics-client-ca\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:25.651491 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:25.651468 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdf2w\" (UniqueName: \"kubernetes.io/projected/6da03822-75a1-421e-be29-b8b96eba8b6b-kube-api-access-cdf2w\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:26.142956 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:26.142921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-tls\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:26.145372 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:26.145352 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6da03822-75a1-421e-be29-b8b96eba8b6b-node-exporter-tls\") pod \"node-exporter-dd897\" (UID: \"6da03822-75a1-421e-be29-b8b96eba8b6b\") " pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:26.268432 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:26.268403 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dd897" Apr 16 17:41:26.276870 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:26.276838 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6da03822_75a1_421e_be29_b8b96eba8b6b.slice/crio-bfd81f68dd99213942399ac960db5add66952972e84bcf4d7b01e544480b694b WatchSource:0}: Error finding container bfd81f68dd99213942399ac960db5add66952972e84bcf4d7b01e544480b694b: Status 404 returned error can't find the container with id bfd81f68dd99213942399ac960db5add66952972e84bcf4d7b01e544480b694b Apr 16 17:41:26.507035 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:26.506955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dd897" event={"ID":"6da03822-75a1-421e-be29-b8b96eba8b6b","Type":"ContainerStarted","Data":"bfd81f68dd99213942399ac960db5add66952972e84bcf4d7b01e544480b694b"} Apr 16 17:41:28.422183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.422149 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm"] Apr 16 17:41:28.455513 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.455483 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm"] Apr 16 17:41:28.455667 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.455658 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.457755 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.457722 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.457877 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.457781 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42c552ad-3526-4acd-8d4a-835ad2b66320-metrics-client-ca\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.457877 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.457817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-grpc-tls\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.457877 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.457848 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.458034 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.457932 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.458034 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458014 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.458104 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458046 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqqgs\" (UniqueName: \"kubernetes.io/projected/42c552ad-3526-4acd-8d4a-835ad2b66320-kube-api-access-bqqgs\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.458104 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458080 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-tls\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.458626 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458584 2579 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 17:41:28.458626 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458624 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 17:41:28.458822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458649 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9tq6j00eao7c7\"" Apr 16 17:41:28.458822 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458671 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 17:41:28.458979 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.458960 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 17:41:28.459082 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.459057 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 17:41:28.459147 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.459134 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-gs4zv\"" Apr 16 17:41:28.516241 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.516208 2579 generic.go:358] "Generic (PLEG): container finished" podID="6da03822-75a1-421e-be29-b8b96eba8b6b" containerID="a413510566da36773696b91a85ce7f1feb94380c1977bd0f777994ee57993278" exitCode=0 Apr 16 17:41:28.516387 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.516265 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dd897" 
event={"ID":"6da03822-75a1-421e-be29-b8b96eba8b6b","Type":"ContainerDied","Data":"a413510566da36773696b91a85ce7f1feb94380c1977bd0f777994ee57993278"} Apr 16 17:41:28.559028 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559103 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559064 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42c552ad-3526-4acd-8d4a-835ad2b66320-metrics-client-ca\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559103 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-grpc-tls\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559209 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559260 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559292 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqqgs\" (UniqueName: \"kubernetes.io/projected/42c552ad-3526-4acd-8d4a-835ad2b66320-kube-api-access-bqqgs\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.559377 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.559340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-tls\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.560342 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.560027 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/42c552ad-3526-4acd-8d4a-835ad2b66320-metrics-client-ca\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.562102 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.562075 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.562469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.562443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-tls\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.562578 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.562552 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.563026 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.563000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: 
\"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.563318 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.563298 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-grpc-tls\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.563417 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.563400 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/42c552ad-3526-4acd-8d4a-835ad2b66320-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.569186 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.569170 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqqgs\" (UniqueName: \"kubernetes.io/projected/42c552ad-3526-4acd-8d4a-835ad2b66320-kube-api-access-bqqgs\") pod \"thanos-querier-5d4b74bc65-qcwnm\" (UID: \"42c552ad-3526-4acd-8d4a-835ad2b66320\") " pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.765277 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.765252 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:28.894468 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:28.894443 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm"] Apr 16 17:41:28.896868 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:28.896840 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c552ad_3526_4acd_8d4a_835ad2b66320.slice/crio-d26ab03166ffb5c7b13fd3bd9cc32baed7651da52b98e790184d1f77455f5e1a WatchSource:0}: Error finding container d26ab03166ffb5c7b13fd3bd9cc32baed7651da52b98e790184d1f77455f5e1a: Status 404 returned error can't find the container with id d26ab03166ffb5c7b13fd3bd9cc32baed7651da52b98e790184d1f77455f5e1a Apr 16 17:41:29.522017 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.521976 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dd897" event={"ID":"6da03822-75a1-421e-be29-b8b96eba8b6b","Type":"ContainerStarted","Data":"3b65c1b17822545d296fbfc3d751d56f865d99cdd789a9529121ffbb5f7f253a"} Apr 16 17:41:29.522017 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.522012 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dd897" event={"ID":"6da03822-75a1-421e-be29-b8b96eba8b6b","Type":"ContainerStarted","Data":"1fe53c2f35c962515f933773068829f27ecd81eb0895afebef4fbda7b6de5b55"} Apr 16 17:41:29.523022 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.523002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"d26ab03166ffb5c7b13fd3bd9cc32baed7651da52b98e790184d1f77455f5e1a"} Apr 16 17:41:29.546166 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.546117 2579 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/node-exporter-dd897" podStartSLOduration=3.412938486 podStartE2EDuration="4.546102328s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:26.27901004 +0000 UTC m=+46.530424859" lastFinishedPulling="2026-04-16 17:41:27.412173883 +0000 UTC m=+47.663588701" observedRunningTime="2026-04-16 17:41:29.544105466 +0000 UTC m=+49.795520294" watchObservedRunningTime="2026-04-16 17:41:29.546102328 +0000 UTC m=+49.797517155" Apr 16 17:41:29.714435 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.714400 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-794c6668b-clrf9"] Apr 16 17:41:29.727249 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.726520 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.727249 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.727187 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-794c6668b-clrf9"] Apr 16 17:41:29.730598 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.730570 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-cjevaedqvnaa9\"" Apr 16 17:41:29.735325 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.735281 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 17:41:29.735325 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.735311 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 17:41:29.735533 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.735367 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-p6hqk\"" Apr 16 17:41:29.735533 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.735369 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 17:41:29.735980 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.735957 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 17:41:29.769384 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1621992d-f0ff-463a-bc83-c10aa76a4028-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.769542 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-client-ca-bundle\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.769542 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1621992d-f0ff-463a-bc83-c10aa76a4028-audit-log\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.769542 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769493 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" 
(UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-secret-metrics-server-tls\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.769719 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769543 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-secret-metrics-server-client-certs\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.769719 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769619 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1621992d-f0ff-463a-bc83-c10aa76a4028-metrics-server-audit-profiles\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.769719 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.769668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8p4j\" (UniqueName: \"kubernetes.io/projected/1621992d-f0ff-463a-bc83-c10aa76a4028-kube-api-access-b8p4j\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870400 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870311 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1621992d-f0ff-463a-bc83-c10aa76a4028-configmap-kubelet-serving-ca-bundle\") pod 
\"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870565 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-client-ca-bundle\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870565 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1621992d-f0ff-463a-bc83-c10aa76a4028-audit-log\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870565 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-secret-metrics-server-tls\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870565 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-secret-metrics-server-client-certs\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870785 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870624 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1621992d-f0ff-463a-bc83-c10aa76a4028-metrics-server-audit-profiles\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.870785 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.870671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8p4j\" (UniqueName: \"kubernetes.io/projected/1621992d-f0ff-463a-bc83-c10aa76a4028-kube-api-access-b8p4j\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.871169 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.871136 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1621992d-f0ff-463a-bc83-c10aa76a4028-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.871454 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.871433 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1621992d-f0ff-463a-bc83-c10aa76a4028-audit-log\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.871819 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.871793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1621992d-f0ff-463a-bc83-c10aa76a4028-metrics-server-audit-profiles\") pod \"metrics-server-794c6668b-clrf9\" (UID: 
\"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.873836 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.873815 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-client-ca-bundle\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.874097 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.874079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-secret-metrics-server-tls\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.878189 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.878165 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1621992d-f0ff-463a-bc83-c10aa76a4028-secret-metrics-server-client-certs\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:29.881635 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:29.881615 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8p4j\" (UniqueName: \"kubernetes.io/projected/1621992d-f0ff-463a-bc83-c10aa76a4028-kube-api-access-b8p4j\") pod \"metrics-server-794c6668b-clrf9\" (UID: \"1621992d-f0ff-463a-bc83-c10aa76a4028\") " pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:30.042098 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.042065 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-794c6668b-clrf9" Apr 16 17:41:30.077251 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.077221 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k"] Apr 16 17:41:30.094479 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.094442 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k"] Apr 16 17:41:30.094626 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.094554 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:30.097755 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.097504 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-g66nd\"" Apr 16 17:41:30.097755 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.097532 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 17:41:30.173433 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.173404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ac1a402-aba7-4034-b952-df5f690dece5-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9bt6k\" (UID: \"0ac1a402-aba7-4034-b952-df5f690dece5\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:30.190811 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.190783 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-794c6668b-clrf9"] Apr 16 17:41:30.273958 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.273920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/0ac1a402-aba7-4034-b952-df5f690dece5-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9bt6k\" (UID: \"0ac1a402-aba7-4034-b952-df5f690dece5\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:30.274127 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:30.274078 2579 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 17:41:30.274174 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:41:30.274144 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ac1a402-aba7-4034-b952-df5f690dece5-monitoring-plugin-cert podName:0ac1a402-aba7-4034-b952-df5f690dece5 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:30.774128421 +0000 UTC m=+51.025543224 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0ac1a402-aba7-4034-b952-df5f690dece5-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-9bt6k" (UID: "0ac1a402-aba7-4034-b952-df5f690dece5") : secret "monitoring-plugin-cert" not found Apr 16 17:41:30.771586 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:30.771530 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1621992d_f0ff_463a_bc83_c10aa76a4028.slice/crio-f4f7ebd48b19a6e86dc341d11d849118f363456b117d094dfa724052781fd766 WatchSource:0}: Error finding container f4f7ebd48b19a6e86dc341d11d849118f363456b117d094dfa724052781fd766: Status 404 returned error can't find the container with id f4f7ebd48b19a6e86dc341d11d849118f363456b117d094dfa724052781fd766 Apr 16 17:41:30.778063 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.778042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ac1a402-aba7-4034-b952-df5f690dece5-monitoring-plugin-cert\") 
pod \"monitoring-plugin-5876b4bbc7-9bt6k\" (UID: \"0ac1a402-aba7-4034-b952-df5f690dece5\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:30.780736 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:30.780692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0ac1a402-aba7-4034-b952-df5f690dece5-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-9bt6k\" (UID: \"0ac1a402-aba7-4034-b952-df5f690dece5\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:31.009740 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.009688 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:31.242492 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.242357 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k"] Apr 16 17:41:31.255225 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:31.255199 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac1a402_aba7_4034_b952_df5f690dece5.slice/crio-de35f84c26b642d7d23ea0de9be3f60f4a8a884c5af02eda4c48033772258a85 WatchSource:0}: Error finding container de35f84c26b642d7d23ea0de9be3f60f4a8a884c5af02eda4c48033772258a85: Status 404 returned error can't find the container with id de35f84c26b642d7d23ea0de9be3f60f4a8a884c5af02eda4c48033772258a85 Apr 16 17:41:31.493141 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.493118 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ck9nk" Apr 16 17:41:31.547979 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.547700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" 
event={"ID":"0ac1a402-aba7-4034-b952-df5f690dece5","Type":"ContainerStarted","Data":"de35f84c26b642d7d23ea0de9be3f60f4a8a884c5af02eda4c48033772258a85"} Apr 16 17:41:31.550113 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.549825 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"6f59f2dc0ca43f486797d8458ceeedc16e9cc1c9d5c45481b4e12ca1f5ad1e41"} Apr 16 17:41:31.550113 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.549861 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"e5520168220816bbb757425424c2cdb60a902b6e38fb560e4ec05cbaceb6536f"} Apr 16 17:41:31.550113 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.549874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"3d3648a5624b83f969b90855d64d9340b9c7b98db125788d11d674cb6150f0bf"} Apr 16 17:41:31.552259 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.552226 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-794c6668b-clrf9" event={"ID":"1621992d-f0ff-463a-bc83-c10aa76a4028","Type":"ContainerStarted","Data":"f4f7ebd48b19a6e86dc341d11d849118f363456b117d094dfa724052781fd766"} Apr 16 17:41:31.616821 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.616779 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:41:31.630448 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.630422 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.634900 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.634837 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.642774 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.643105 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.643301 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.644100 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.644388 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.644787 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.645031 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.645515 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6vzqw\"" Apr 16 17:41:31.645860 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.645694 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2v42kiogloaq\"" Apr 16 17:41:31.646383 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.646271 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 17:41:31.646607 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.646536 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 17:41:31.647964 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.647937 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 17:41:31.653068 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.652990 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:41:31.660192 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.659927 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 17:41:31.685904 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.685655 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.685904 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.685778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config-out\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.685904 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.685823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.685904 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.685860 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.685904 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.685889 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.685923 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686010 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686040 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-web-config\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686146 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686167 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpgz\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-kube-api-access-6cpgz\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686248 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686279 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.686300 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.686299 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.787566 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787601 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-web-config\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787640 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cpgz\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-kube-api-access-6cpgz\") 
pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787871 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788014 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.787998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788582 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.788028 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788582 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.788074 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config-out\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788582 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.788101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788582 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.788138 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788582 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.788168 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.788831 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.788586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.789541 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.789516 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.791963 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.791857 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.791963 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:41:31.791907 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.791963 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.791908 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.792412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config-out\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.792504 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.793821 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.794408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.796112 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.796406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.796700 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.797416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config\") pod \"prometheus-k8s-0\" (UID: 
\"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.797569 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.798025 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.799343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-web-config\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.801872 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.810940 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.802406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpgz\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-kube-api-access-6cpgz\") pod \"prometheus-k8s-0\" (UID: 
\"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:31.942567 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:31.942485 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:33.608904 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:33.608869 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:41:33.610798 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:33.610774 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1008fa2f_0c0b_481b_8f45_85472dcf8acf.slice/crio-731b8a79b9e333744715c39f6eae4aaf9d7b6688285650dddad8daa858a899aa WatchSource:0}: Error finding container 731b8a79b9e333744715c39f6eae4aaf9d7b6688285650dddad8daa858a899aa: Status 404 returned error can't find the container with id 731b8a79b9e333744715c39f6eae4aaf9d7b6688285650dddad8daa858a899aa Apr 16 17:41:34.562858 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.562821 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" event={"ID":"0ac1a402-aba7-4034-b952-df5f690dece5","Type":"ContainerStarted","Data":"6a9b8f92278b5cd81ba0ef667c200879738f3294505d902e12e6270e9049de8b"} Apr 16 17:41:34.563058 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.563018 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:34.566062 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.565993 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"240ae51c4c88b43a899dc287a549739da45af649467de538ea6d43e8dae76f6a"} Apr 16 17:41:34.566062 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:41:34.566028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"fe0a90f8b06f95812cdd5e89548f32d5b482cf030c43e60d1d2bdf5e06ed7e5a"} Apr 16 17:41:34.566062 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.566043 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" event={"ID":"42c552ad-3526-4acd-8d4a-835ad2b66320","Type":"ContainerStarted","Data":"7173d740e09bfc70b743efef4631f1dade27818a28a08a47edbec799d4726e3f"} Apr 16 17:41:34.566262 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.566199 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:34.567754 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.567693 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-794c6668b-clrf9" event={"ID":"1621992d-f0ff-463a-bc83-c10aa76a4028","Type":"ContainerStarted","Data":"f1e1f9a0378f64b8f188ba49045d32453d0e6e18be9ff48fa38cace6e26e14cd"} Apr 16 17:41:34.569125 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.569101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"731b8a79b9e333744715c39f6eae4aaf9d7b6688285650dddad8daa858a899aa"} Apr 16 17:41:34.569666 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.569649 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" Apr 16 17:41:34.582197 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.582156 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-9bt6k" 
podStartSLOduration=2.366931105 podStartE2EDuration="4.58214155s" podCreationTimestamp="2026-04-16 17:41:30 +0000 UTC" firstStartedPulling="2026-04-16 17:41:31.257025981 +0000 UTC m=+51.508440786" lastFinishedPulling="2026-04-16 17:41:33.472236423 +0000 UTC m=+53.723651231" observedRunningTime="2026-04-16 17:41:34.581110108 +0000 UTC m=+54.832524938" watchObservedRunningTime="2026-04-16 17:41:34.58214155 +0000 UTC m=+54.833556378" Apr 16 17:41:34.622830 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.622775 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-794c6668b-clrf9" podStartSLOduration=2.923999953 podStartE2EDuration="5.622759349s" podCreationTimestamp="2026-04-16 17:41:29 +0000 UTC" firstStartedPulling="2026-04-16 17:41:30.773475293 +0000 UTC m=+51.024890109" lastFinishedPulling="2026-04-16 17:41:33.472234694 +0000 UTC m=+53.723649505" observedRunningTime="2026-04-16 17:41:34.621903252 +0000 UTC m=+54.873318078" watchObservedRunningTime="2026-04-16 17:41:34.622759349 +0000 UTC m=+54.874174176" Apr 16 17:41:34.650079 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:34.650030 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" podStartSLOduration=1.7453162020000001 podStartE2EDuration="6.650016509s" podCreationTimestamp="2026-04-16 17:41:28 +0000 UTC" firstStartedPulling="2026-04-16 17:41:28.898972778 +0000 UTC m=+49.150387583" lastFinishedPulling="2026-04-16 17:41:33.803673083 +0000 UTC m=+54.055087890" observedRunningTime="2026-04-16 17:41:34.648902126 +0000 UTC m=+54.900316954" watchObservedRunningTime="2026-04-16 17:41:34.650016509 +0000 UTC m=+54.901431334" Apr 16 17:41:35.573568 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:35.573539 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" 
containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" exitCode=0 Apr 16 17:41:35.573769 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:35.573652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} Apr 16 17:41:38.446620 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:38.446593 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kd892" Apr 16 17:41:38.589803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:38.589286 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} Apr 16 17:41:38.589803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:38.589329 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} Apr 16 17:41:38.589803 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:38.589343 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} Apr 16 17:41:39.595501 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:39.595467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} Apr 16 17:41:39.595501 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:41:39.595503 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} Apr 16 17:41:39.595920 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:39.595512 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerStarted","Data":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} Apr 16 17:41:39.627544 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:39.627497 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.889210385 podStartE2EDuration="8.627480908s" podCreationTimestamp="2026-04-16 17:41:31 +0000 UTC" firstStartedPulling="2026-04-16 17:41:33.613450452 +0000 UTC m=+53.864865256" lastFinishedPulling="2026-04-16 17:41:38.351720975 +0000 UTC m=+58.603135779" observedRunningTime="2026-04-16 17:41:39.625699725 +0000 UTC m=+59.877114551" watchObservedRunningTime="2026-04-16 17:41:39.627480908 +0000 UTC m=+59.878895735" Apr 16 17:41:40.580361 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:40.580331 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d4b74bc65-qcwnm" Apr 16 17:41:41.943162 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:41.943120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:41:46.028005 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.027966 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: 
\"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:46.031662 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.031639 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 17:41:46.040959 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.040936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c8eeb2ef-1018-4541-8fb1-d6d55ca7680f-metrics-certs\") pod \"network-metrics-daemon-8zmrl\" (UID: \"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f\") " pod="openshift-multus/network-metrics-daemon-8zmrl" Apr 16 17:41:46.113307 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.113285 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jmq5l\"" Apr 16 17:41:46.121851 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.121831 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zmrl"
Apr 16 17:41:46.128787 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.128758 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:41:46.132191 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.132173 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 17:41:46.142770 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.142747 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 17:41:46.153274 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.153252 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvclr\" (UniqueName: \"kubernetes.io/projected/d4e56557-2a7a-4842-bf26-96d804cdf01b-kube-api-access-gvclr\") pod \"network-check-target-792zb\" (UID: \"d4e56557-2a7a-4842-bf26-96d804cdf01b\") " pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:41:46.268163 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.268131 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8zmrl"]
Apr 16 17:41:46.271007 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:46.270981 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8eeb2ef_1018_4541_8fb1_d6d55ca7680f.slice/crio-9a553ff246c549d404f6e4e2ecd13bd33da125032c7871606ce6301962f1c48e WatchSource:0}: Error finding container 9a553ff246c549d404f6e4e2ecd13bd33da125032c7871606ce6301962f1c48e: Status 404 returned error can't find the container with id 9a553ff246c549d404f6e4e2ecd13bd33da125032c7871606ce6301962f1c48e
Apr 16 17:41:46.408154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.408071 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wmwn6\""
Apr 16 17:41:46.416054 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.416034 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:41:46.538344 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.538317 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-792zb"]
Apr 16 17:41:46.540449 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:41:46.540418 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e56557_2a7a_4842_bf26_96d804cdf01b.slice/crio-a0cf1e565d54fb8f810ecfb16dce1da129b12cfaf73be6a8a99d8a107dce3bba WatchSource:0}: Error finding container a0cf1e565d54fb8f810ecfb16dce1da129b12cfaf73be6a8a99d8a107dce3bba: Status 404 returned error can't find the container with id a0cf1e565d54fb8f810ecfb16dce1da129b12cfaf73be6a8a99d8a107dce3bba
Apr 16 17:41:46.619896 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.619850 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-792zb" event={"ID":"d4e56557-2a7a-4842-bf26-96d804cdf01b","Type":"ContainerStarted","Data":"a0cf1e565d54fb8f810ecfb16dce1da129b12cfaf73be6a8a99d8a107dce3bba"}
Apr 16 17:41:46.621060 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:46.621027 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8zmrl" event={"ID":"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f","Type":"ContainerStarted","Data":"9a553ff246c549d404f6e4e2ecd13bd33da125032c7871606ce6301962f1c48e"}
Apr 16 17:41:47.626792 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:47.626754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8zmrl" event={"ID":"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f","Type":"ContainerStarted","Data":"887a3048af925fcd68b9a97608f3beefe6bc45132202b91c02d2c4a54200648f"}
Apr 16 17:41:47.626792 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:47.626799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8zmrl" event={"ID":"c8eeb2ef-1018-4541-8fb1-d6d55ca7680f","Type":"ContainerStarted","Data":"f547c0f441c48ecc866fa3244f26249e0d92607e5960032360af28db00b26e2f"}
Apr 16 17:41:49.635484 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:49.635389 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-792zb" event={"ID":"d4e56557-2a7a-4842-bf26-96d804cdf01b","Type":"ContainerStarted","Data":"cb61d476d197631854f44a8e07d4e40b89ee59404426e370f5ba23e280e91dbb"}
Apr 16 17:41:49.635849 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:49.635538 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:41:49.655966 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:49.655921 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8zmrl" podStartSLOduration=68.689273237 podStartE2EDuration="1m9.655904611s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:41:46.273001501 +0000 UTC m=+66.524416305" lastFinishedPulling="2026-04-16 17:41:47.239632871 +0000 UTC m=+67.491047679" observedRunningTime="2026-04-16 17:41:47.646187351 +0000 UTC m=+67.897602204" watchObservedRunningTime="2026-04-16 17:41:49.655904611 +0000 UTC m=+69.907319482"
Apr 16 17:41:50.042510 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:50.042480 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-794c6668b-clrf9"
Apr 16 17:41:50.042651 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:50.042563 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-794c6668b-clrf9"
Apr 16 17:41:50.351061 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:50.350972 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:41:50.370207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:50.370180 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:41:50.404999 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:50.404942 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-792zb" podStartSLOduration=67.596377798 podStartE2EDuration="1m10.40492643s" podCreationTimestamp="2026-04-16 17:40:40 +0000 UTC" firstStartedPulling="2026-04-16 17:41:46.542580223 +0000 UTC m=+66.793995027" lastFinishedPulling="2026-04-16 17:41:49.35112885 +0000 UTC m=+69.602543659" observedRunningTime="2026-04-16 17:41:49.655105164 +0000 UTC m=+69.906519990" watchObservedRunningTime="2026-04-16 17:41:50.40492643 +0000 UTC m=+70.656341263"
Apr 16 17:41:50.653461 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:41:50.653383 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:10.048153 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:10.048121 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-794c6668b-clrf9"
Apr 16 17:42:10.051849 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:10.051829 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-794c6668b-clrf9"
Apr 16 17:42:19.355394 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.355357 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 17:42:19.355927 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.355837 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="prometheus" containerID="cri-o://cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" gracePeriod=600
Apr 16 17:42:19.355927 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.355869 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy" containerID="cri-o://c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" gracePeriod=600
Apr 16 17:42:19.356050 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.355935 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="config-reloader" containerID="cri-o://37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" gracePeriod=600
Apr 16 17:42:19.356050 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.355943 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-thanos" containerID="cri-o://087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" gracePeriod=600
Apr 16 17:42:19.356050 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.355877 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="thanos-sidecar" containerID="cri-o://c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" gracePeriod=600
Apr 16 17:42:19.356191 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.356078 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-web" containerID="cri-o://b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" gracePeriod=600
Apr 16 17:42:19.605832 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.605771 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.687609 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687559 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-db\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.687791 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687620 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-rulefiles-0\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.687791 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687655 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config-out\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.687791 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687688 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.687791 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687739 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.687791 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687769 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-kube-rbac-proxy\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687799 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-metrics-client-ca\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687826 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-trusted-ca-bundle\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687874 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-thanos-prometheus-http-client-file\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687902 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-tls\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687960 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-tls-assets\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.687994 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688025 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-serving-certs-ca-bundle\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688067 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-grpc-tls\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688092 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688094 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cpgz\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-kube-api-access-6cpgz\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688486 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688129 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-metrics-client-certs\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688486 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688164 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-web-config\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688486 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688202 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-kubelet-serving-ca-bundle\") pod \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\" (UID: \"1008fa2f-0c0b-481b-8f45-85472dcf8acf\") "
Apr 16 17:42:19.688486 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688389 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:42:19.689032 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688757 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:19.689032 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.688906 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:19.689210 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.689139 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:19.689637 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.689604 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:19.692031 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.691997 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.692175 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.692082 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config-out" (OuterVolumeSpecName: "config-out") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:42:19.692175 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.692121 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:19.692687 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.692661 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.692977 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.692953 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config" (OuterVolumeSpecName: "config") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.693757 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.693730 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-kube-api-access-6cpgz" (OuterVolumeSpecName: "kube-api-access-6cpgz") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "kube-api-access-6cpgz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:42:19.694553 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.694525 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:42:19.694784 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.694763 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.694869 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.694788 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.694869 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.694819 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.694869 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.694836 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.694997 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.694859 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.706393 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.706358 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-web-config" (OuterVolumeSpecName: "web-config") pod "1008fa2f-0c0b-481b-8f45-85472dcf8acf" (UID: "1008fa2f-0c0b-481b-8f45-85472dcf8acf"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 17:42:19.719048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719027 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" exitCode=0
Apr 16 17:42:19.719048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719045 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" exitCode=0
Apr 16 17:42:19.719048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719052 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" exitCode=0
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719058 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" exitCode=0
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719063 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" exitCode=0
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719068 2579 generic.go:358] "Generic (PLEG): container finished" podID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" exitCode=0
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"}
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719134 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719155 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"}
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719166 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"}
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719176 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"}
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"}
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"}
Apr 16 17:42:19.719207 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719204 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1008fa2f-0c0b-481b-8f45-85472dcf8acf","Type":"ContainerDied","Data":"731b8a79b9e333744715c39f6eae4aaf9d7b6688285650dddad8daa858a899aa"}
Apr 16 17:42:19.719570 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.719222 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"
Apr 16 17:42:19.726278 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.726262 2579 scope.go:117] "RemoveContainer" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"
Apr 16 17:42:19.732682 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.732667 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"
Apr 16 17:42:19.738831 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.738813 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"
Apr 16 17:42:19.744887 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.744873 2579 scope.go:117] "RemoveContainer" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"
Apr 16 17:42:19.750857 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.750842 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"
Apr 16 17:42:19.755559 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.755538 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 17:42:19.757695 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.757679 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"
Apr 16 17:42:19.764531 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.764515 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"
Apr 16 17:42:19.764767 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.764749 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 17:42:19.764836 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:42:19.764816 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"
Apr 16 17:42:19.764883 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.764846 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} err="failed to get container status \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist"
Apr 16 17:42:19.764923 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.764887 2579 scope.go:117] "RemoveContainer" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"
Apr 16 17:42:19.765142 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:42:19.765118 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"
Apr 16 17:42:19.765230 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.765152 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} err="failed to get container status \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": rpc error: code = NotFound desc = could not find container \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist"
Apr 16 17:42:19.765230 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.765171 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"
Apr 16 17:42:19.765409 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:42:19.765393 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"
Apr 16 17:42:19.765450 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.765411 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} err="failed to get container status \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist"
Apr 16 17:42:19.765450 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.765429 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"
Apr 16 17:42:19.765722 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:42:19.765686 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"
Apr 16 17:42:19.765810 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.765730 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} err="failed to get container status \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist"
Apr 16 17:42:19.765810 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.765771 2579 scope.go:117] "RemoveContainer" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"
Apr 16 17:42:19.766030 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:42:19.766008 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"
Apr 16 17:42:19.766106 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766034 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} err="failed to get container status \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": rpc error: code = NotFound desc = could not find container \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist"
Apr 16 17:42:19.766106 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766056 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"
Apr 16 17:42:19.766285 ip-10-0-133-244 kubenswrapper[2579]: E0416 17:42:19.766269 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"
Apr 16 17:42:19.766345 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766302 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} err="failed to get container status \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist"
Apr 16 17:42:19.766345 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766323 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"
Apr 16 17:42:19.766542 ip-10-0-133-244
kubenswrapper[2579]: E0416 17:42:19.766526 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" Apr 16 17:42:19.766579 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766547 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} err="failed to get container status \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" Apr 16 17:42:19.766579 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766564 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" Apr 16 17:42:19.766831 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766813 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} err="failed to get container status \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist" Apr 16 17:42:19.766903 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.766832 2579 scope.go:117] "RemoveContainer" 
containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" Apr 16 17:42:19.767059 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767042 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} err="failed to get container status \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": rpc error: code = NotFound desc = could not find container \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist" Apr 16 17:42:19.767108 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767062 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" Apr 16 17:42:19.767266 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767249 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} err="failed to get container status \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist" Apr 16 17:42:19.767311 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767267 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" Apr 16 17:42:19.767454 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767438 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} err="failed to get container status 
\"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist" Apr 16 17:42:19.767512 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767456 2579 scope.go:117] "RemoveContainer" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" Apr 16 17:42:19.767660 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767645 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} err="failed to get container status \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": rpc error: code = NotFound desc = could not find container \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist" Apr 16 17:42:19.767721 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767660 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" Apr 16 17:42:19.767861 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.767845 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} err="failed to get container status \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist" Apr 16 17:42:19.767905 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:42:19.767861 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" Apr 16 17:42:19.768048 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768030 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} err="failed to get container status \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" Apr 16 17:42:19.768113 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768050 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" Apr 16 17:42:19.768285 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768267 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} err="failed to get container status \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist" Apr 16 17:42:19.768285 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768286 2579 scope.go:117] "RemoveContainer" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" Apr 16 17:42:19.768513 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768491 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} err="failed to get container status \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": rpc error: code = NotFound desc = could not find container \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist" Apr 16 17:42:19.768597 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768515 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" Apr 16 17:42:19.768770 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768745 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} err="failed to get container status \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist" Apr 16 17:42:19.768821 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768773 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" Apr 16 17:42:19.768979 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768963 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} err="failed to get container status \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with 
c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist" Apr 16 17:42:19.769021 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.768980 2579 scope.go:117] "RemoveContainer" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" Apr 16 17:42:19.769159 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769141 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} err="failed to get container status \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": rpc error: code = NotFound desc = could not find container \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist" Apr 16 17:42:19.769212 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769160 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" Apr 16 17:42:19.769339 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769324 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} err="failed to get container status \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist" Apr 16 17:42:19.769381 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769339 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" Apr 16 17:42:19.769514 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769500 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} err="failed to get container status \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" Apr 16 17:42:19.769572 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769514 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" Apr 16 17:42:19.769721 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769686 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} err="failed to get container status \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist" Apr 16 17:42:19.769768 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769726 2579 scope.go:117] "RemoveContainer" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" Apr 16 17:42:19.769910 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769895 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} err="failed to get container status \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": rpc error: code = NotFound desc = could not find container 
\"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist" Apr 16 17:42:19.769947 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.769910 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" Apr 16 17:42:19.770086 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770071 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} err="failed to get container status \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist" Apr 16 17:42:19.770133 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770087 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" Apr 16 17:42:19.770303 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770287 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} err="failed to get container status \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist" Apr 16 17:42:19.770303 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770302 2579 scope.go:117] "RemoveContainer" 
containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" Apr 16 17:42:19.770457 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770443 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} err="failed to get container status \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": rpc error: code = NotFound desc = could not find container \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist" Apr 16 17:42:19.770505 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770458 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" Apr 16 17:42:19.770649 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770634 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} err="failed to get container status \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist" Apr 16 17:42:19.770697 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770649 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" Apr 16 17:42:19.770850 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770833 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} err="failed to get container status 
\"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" Apr 16 17:42:19.770894 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.770850 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" Apr 16 17:42:19.771056 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771037 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} err="failed to get container status \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist" Apr 16 17:42:19.771056 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771055 2579 scope.go:117] "RemoveContainer" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" Apr 16 17:42:19.771229 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771214 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} err="failed to get container status \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": rpc error: code = NotFound desc = could not find container \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist" Apr 16 17:42:19.771229 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:42:19.771228 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" Apr 16 17:42:19.771415 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771399 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} err="failed to get container status \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist" Apr 16 17:42:19.771464 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771416 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" Apr 16 17:42:19.771629 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771614 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} err="failed to get container status \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist" Apr 16 17:42:19.771733 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771640 2579 scope.go:117] "RemoveContainer" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" Apr 16 17:42:19.771909 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771891 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} err="failed to get container status \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": rpc error: code = NotFound desc = could not find container \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist" Apr 16 17:42:19.771960 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.771911 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" Apr 16 17:42:19.772117 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772096 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} err="failed to get container status \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist" Apr 16 17:42:19.772117 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772116 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" Apr 16 17:42:19.772355 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772327 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} err="failed to get container status \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with 
b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" Apr 16 17:42:19.772413 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772359 2579 scope.go:117] "RemoveContainer" containerID="087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7" Apr 16 17:42:19.772601 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772577 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7"} err="failed to get container status \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": rpc error: code = NotFound desc = could not find container \"087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7\": container with ID starting with 087edd03a2f9e3013bf3e4ba33d644dcdc9bfd3cbdfcd23d55d33a1513659fc7 not found: ID does not exist" Apr 16 17:42:19.772601 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772594 2579 scope.go:117] "RemoveContainer" containerID="c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9" Apr 16 17:42:19.772829 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772809 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9"} err="failed to get container status \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": rpc error: code = NotFound desc = could not find container \"c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9\": container with ID starting with c05ad399731cd92592da333c0f2cbeaf9e9a8d34e40fd24b03b557c1e91ff3a9 not found: ID does not exist" Apr 16 17:42:19.772906 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.772832 2579 scope.go:117] "RemoveContainer" containerID="b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47" Apr 16 17:42:19.773103 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773085 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47"} err="failed to get container status \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": rpc error: code = NotFound desc = could not find container \"b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47\": container with ID starting with b9950758b74831bf12bc0d48e2c964d16378bd3ce0fd592c1f693bc73d35df47 not found: ID does not exist" Apr 16 17:42:19.773169 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773104 2579 scope.go:117] "RemoveContainer" containerID="c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d" Apr 16 17:42:19.773313 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773296 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d"} err="failed to get container status \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": rpc error: code = NotFound desc = could not find container \"c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d\": container with ID starting with c61129751ca3eedb9638204808b028556fb499ee935a8b963d4fe2dfd623086d not found: ID does not exist" Apr 16 17:42:19.773369 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773314 2579 scope.go:117] "RemoveContainer" containerID="37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334" Apr 16 17:42:19.773495 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773479 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334"} err="failed to get container status \"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": rpc error: code = NotFound desc = could not find container 
\"37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334\": container with ID starting with 37deebd4b78c0edb517cd2a8c35e9e9edcb8759922dadf5b5f4cfabd0cbc6334 not found: ID does not exist" Apr 16 17:42:19.773495 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773493 2579 scope.go:117] "RemoveContainer" containerID="cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5" Apr 16 17:42:19.773726 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773677 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5"} err="failed to get container status \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": rpc error: code = NotFound desc = could not find container \"cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5\": container with ID starting with cdcb8da14b73434ca2fbc6dfa54378a1c6cd02acff9395ad73c2670afebf2ba5 not found: ID does not exist" Apr 16 17:42:19.773838 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.773745 2579 scope.go:117] "RemoveContainer" containerID="b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24" Apr 16 17:42:19.774019 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.774001 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24"} err="failed to get container status \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": rpc error: code = NotFound desc = could not find container \"b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24\": container with ID starting with b530ed95c47615590ee2865ae858f6e170154a263ab8a27f42491da1f7998f24 not found: ID does not exist" Apr 16 17:42:19.789101 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789078 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config-out\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789101 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789099 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-config\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789109 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789122 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-kube-rbac-proxy\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789133 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-metrics-client-ca\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789142 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789150 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789160 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789168 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-tls-assets\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789177 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789186 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789194 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-grpc-tls\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789203 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cpgz\" (UniqueName: 
\"kubernetes.io/projected/1008fa2f-0c0b-481b-8f45-85472dcf8acf-kube-api-access-6cpgz\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789211 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789211 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-secret-metrics-client-certs\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789554 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789219 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1008fa2f-0c0b-481b-8f45-85472dcf8acf-web-config\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789554 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789227 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789554 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789235 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-db\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.789554 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.789244 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1008fa2f-0c0b-481b-8f45-85472dcf8acf-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-244.ec2.internal\" DevicePath \"\"" Apr 16 17:42:19.795480 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795456 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:42:19.795850 
ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795834 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy" Apr 16 17:42:19.795893 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795868 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy" Apr 16 17:42:19.795893 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795884 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="thanos-sidecar" Apr 16 17:42:19.795954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795893 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="thanos-sidecar" Apr 16 17:42:19.795954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795903 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="config-reloader" Apr 16 17:42:19.795954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795912 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="config-reloader" Apr 16 17:42:19.795954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795925 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-web" Apr 16 17:42:19.795954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795933 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-web" Apr 16 17:42:19.795954 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795951 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" 
containerName="kube-rbac-proxy-thanos" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795960 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-thanos" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795973 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="init-config-reloader" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795981 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="init-config-reloader" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.795993 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="prometheus" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796000 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="prometheus" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796054 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="thanos-sidecar" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796065 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="config-reloader" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796077 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="prometheus" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796085 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-web" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796094 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy" Apr 16 17:42:19.796154 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.796103 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" containerName="kube-rbac-proxy-thanos" Apr 16 17:42:19.801395 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.801378 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.805880 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.805852 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 17:42:19.805999 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.805975 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 17:42:19.806094 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806080 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 17:42:19.806269 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806209 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 17:42:19.806269 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806225 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 17:42:19.806269 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806251 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-6vzqw\"" Apr 16 17:42:19.806431 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806274 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2v42kiogloaq\"" Apr 16 17:42:19.806431 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806337 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 17:42:19.806522 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806507 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 17:42:19.806666 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806653 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 17:42:19.806869 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806853 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 17:42:19.806937 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.806860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 17:42:19.811590 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.811569 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 17:42:19.813148 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.813133 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 17:42:19.815991 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.815968 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 17:42:19.889619 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.889534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.889619 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.889610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-web-config\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.889825 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.889639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890091 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890291 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890402 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890388 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890495 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45pq8\" (UniqueName: \"kubernetes.io/projected/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-kube-api-access-45pq8\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890601 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-config\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890693 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-config-out\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890841 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890826 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.890960 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.890939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891066 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891054 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891155 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891144 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891236 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: 
\"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891332 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891425 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891521 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891508 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.891607 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.891596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992247 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992210 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-config-out\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992247 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 
kubenswrapper[2579]: I0416 17:42:19.992375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992469 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-web-config\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45pq8\" (UniqueName: \"kubernetes.io/projected/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-kube-api-access-45pq8\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-config\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.992824 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.992784 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 17:42:19.993472 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.993333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.993472 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.993359 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.995470 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.995353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-config-out\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996134 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.995643 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996134 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.995798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-config\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996134 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.995960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996134 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.996002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-web-config\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996134 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.996023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996658 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.996152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996658 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.996301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996658 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.996421 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.996658 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.996600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.997934 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.997915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.998676 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.998652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.998773 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.998757 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:19.999423 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:19.999406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:20.006042 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.006014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45pq8\" (UniqueName: \"kubernetes.io/projected/0687f8a4-c1f4-43b6-8096-86bed1a2a48f-kube-api-access-45pq8\") pod \"prometheus-k8s-0\" (UID: \"0687f8a4-c1f4-43b6-8096-86bed1a2a48f\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:20.112183 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.112148 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:42:20.239099 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.239002 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 17:42:20.240998 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:42:20.240971 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0687f8a4_c1f4_43b6_8096_86bed1a2a48f.slice/crio-3c4582699f86096560253e0a1e1ae2332f8e1d665e051203c40107193a403254 WatchSource:0}: Error finding container 3c4582699f86096560253e0a1e1ae2332f8e1d665e051203c40107193a403254: Status 404 returned error can't find the container with id 3c4582699f86096560253e0a1e1ae2332f8e1d665e051203c40107193a403254
Apr 16 17:42:20.288766 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.288738 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1008fa2f-0c0b-481b-8f45-85472dcf8acf" path="/var/lib/kubelet/pods/1008fa2f-0c0b-481b-8f45-85472dcf8acf/volumes"
Apr 16 17:42:20.641085 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.641057 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-792zb"
Apr 16 17:42:20.728381 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.728343 2579 generic.go:358] "Generic (PLEG): container finished" podID="0687f8a4-c1f4-43b6-8096-86bed1a2a48f" containerID="d1648b5a6ceb9b035a3c926c48129cd90e383869db1a81dcb0dc6436b9179d9b" exitCode=0
Apr 16 17:42:20.728548 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.728393 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerDied","Data":"d1648b5a6ceb9b035a3c926c48129cd90e383869db1a81dcb0dc6436b9179d9b"}
Apr 16 17:42:20.728548 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:20.728426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"3c4582699f86096560253e0a1e1ae2332f8e1d665e051203c40107193a403254"}
Apr 16 17:42:21.734359 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.734324 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"7fd7cacd3a46e060d8082d1e9588f40e0821f01d465bd0f46ef4dbe089e44abd"}
Apr 16 17:42:21.734359 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.734359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"5359362129b358e03dcaca8373ac3f56ab2c214c36d215cb362f5e5886a69c70"}
Apr 16 17:42:21.734782 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.734369 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"1b39697357912e6131b731ac97c61199e31d961071543884e6a360c81cb2c7d7"}
Apr 16 17:42:21.734782 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.734380 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"e4bf9f09d4dc72b84f0e39fbea7bf4cc4f0e025c1b2469b8d04df1dea0bf5f64"}
Apr 16 17:42:21.734782 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.734388 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"60e2e03b3c16b9668aa839b18dd8c79c4efccf1d5550a4fe65c55b1a4fbf65ee"}
Apr 16 17:42:21.734782 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.734396 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687f8a4-c1f4-43b6-8096-86bed1a2a48f","Type":"ContainerStarted","Data":"bd5d53059408589156da032a8af9399c4586d88fbc07031c01f3999376bae00a"}
Apr 16 17:42:21.778759 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:21.778688 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.778672145 podStartE2EDuration="2.778672145s" podCreationTimestamp="2026-04-16 17:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:42:21.776217914 +0000 UTC m=+102.027632740" watchObservedRunningTime="2026-04-16 17:42:21.778672145 +0000 UTC m=+102.030086983"
Apr 16 17:42:25.112293 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:42:25.112252 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:43:20.113241 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:43:20.113152 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:43:20.129115 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:43:20.129085 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:43:20.909794 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:43:20.909766 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 17:44:08.435295 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.435256 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qgq8s"]
Apr 16 17:44:08.438341 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.438310 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.441248 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.441228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 17:44:08.452961 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.452939 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qgq8s"]
Apr 16 17:44:08.525257 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.525223 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-original-pull-secret\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.525257 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.525267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-kubelet-config\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.525470 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.525335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-dbus\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.626635 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.626602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-original-pull-secret\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.626814 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.626645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-kubelet-config\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.626814 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.626683 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-dbus\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.626885 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.626810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-kubelet-config\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.626885 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.626852 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-dbus\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.629013 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.628985 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba-original-pull-secret\") pod \"global-pull-secret-syncer-qgq8s\" (UID: \"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba\") " pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.748035 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.747954 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qgq8s"
Apr 16 17:44:08.865660 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:08.865495 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qgq8s"]
Apr 16 17:44:08.868629 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:44:08.868604 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b5b98a_b6cc_4ef2_85d8_f4ea700f0bba.slice/crio-50b4c64a41e7464493b5b31ad6d37539cbe86b4ada2e1604449339f9e538d930 WatchSource:0}: Error finding container 50b4c64a41e7464493b5b31ad6d37539cbe86b4ada2e1604449339f9e538d930: Status 404 returned error can't find the container with id 50b4c64a41e7464493b5b31ad6d37539cbe86b4ada2e1604449339f9e538d930
Apr 16 17:44:09.023791 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:09.023760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qgq8s" event={"ID":"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba","Type":"ContainerStarted","Data":"50b4c64a41e7464493b5b31ad6d37539cbe86b4ada2e1604449339f9e538d930"}
Apr 16 17:44:13.039361 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:13.039307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qgq8s" event={"ID":"12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba","Type":"ContainerStarted","Data":"f3e461747ea1997b60bd34da331c171e9b8b87d8b697bb65a72e04b75ee1c5bb"}
Apr 16 17:44:13.059960 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:13.059846 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qgq8s" podStartSLOduration=1.121152451 podStartE2EDuration="5.059829366s" podCreationTimestamp="2026-04-16 17:44:08 +0000 UTC" firstStartedPulling="2026-04-16 17:44:08.870612618 +0000 UTC m=+209.122027425" lastFinishedPulling="2026-04-16 17:44:12.809289534 +0000 UTC m=+213.060704340" observedRunningTime="2026-04-16 17:44:13.058893304 +0000 UTC m=+213.310308130" watchObservedRunningTime="2026-04-16 17:44:13.059829366 +0000 UTC m=+213.311244193"
Apr 16 17:44:36.127666 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.127633 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"]
Apr 16 17:44:36.129867 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.129851 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.132784 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.132756 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 17:44:36.134150 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.134129 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 17:44:36.134150 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.134140 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 17:44:36.134339 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.134167 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 17:44:36.134339 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.134177 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 17:44:36.134339 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.134133 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 17:44:36.134339 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.134131 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 17:44:36.141536 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.141517 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"]
Apr 16 17:44:36.254772 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.254737 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.254937 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.254780 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.254937 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.254798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-ca\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.254937 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.254907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5xl\" (UniqueName: \"kubernetes.io/projected/b387e84c-77f4-458f-885a-3005a566a77a-kube-api-access-tx5xl\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.255083 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.254942 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-hub\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.255083 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.255002 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b387e84c-77f4-458f-885a-3005a566a77a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.356228 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.356200 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5xl\" (UniqueName: \"kubernetes.io/projected/b387e84c-77f4-458f-885a-3005a566a77a-kube-api-access-tx5xl\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.356397 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.356235 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-hub\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.356397 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.356275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b387e84c-77f4-458f-885a-3005a566a77a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.356397 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.356303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.356397 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.356326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.356397 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.356340 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-ca\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.357337 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.357311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b387e84c-77f4-458f-885a-3005a566a77a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.358961 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.358936 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-hub\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.359037 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.359020 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-ca\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.359141 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.359124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.359239 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.359219 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b387e84c-77f4-458f-885a-3005a566a77a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.367167 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.367140 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5xl\" (UniqueName: \"kubernetes.io/projected/b387e84c-77f4-458f-885a-3005a566a77a-kube-api-access-tx5xl\") pod \"cluster-proxy-proxy-agent-779b5ff569-mdqgv\" (UID: \"b387e84c-77f4-458f-885a-3005a566a77a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.456249 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.456167 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"
Apr 16 17:44:36.579164 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:36.579099 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv"]
Apr 16 17:44:36.581285 ip-10-0-133-244 kubenswrapper[2579]: W0416 17:44:36.581260 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb387e84c_77f4_458f_885a_3005a566a77a.slice/crio-9611b585186cbd3a15f780806085aedba019e37e1858f5b88999cb5eb7887763 WatchSource:0}: Error finding container 9611b585186cbd3a15f780806085aedba019e37e1858f5b88999cb5eb7887763: Status 404 returned error can't find the container with id 9611b585186cbd3a15f780806085aedba019e37e1858f5b88999cb5eb7887763
Apr 16 17:44:37.108194 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:37.108151 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv" event={"ID":"b387e84c-77f4-458f-885a-3005a566a77a","Type":"ContainerStarted","Data":"9611b585186cbd3a15f780806085aedba019e37e1858f5b88999cb5eb7887763"}
Apr 16 17:44:40.118088 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:40.118045 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv" event={"ID":"b387e84c-77f4-458f-885a-3005a566a77a","Type":"ContainerStarted","Data":"2f65272c55742d0a5d04deed6aa3fe80b22cf92f81c4181587fff96ffcc19fa2"}
Apr 16 17:44:42.125611 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:42.125574 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv" event={"ID":"b387e84c-77f4-458f-885a-3005a566a77a","Type":"ContainerStarted","Data":"1f93ad3984d7a13db18a4af5c03e23142f9a1e21a7b3d9a518d448816f7899b1"}
Apr 16 17:44:42.125611 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:42.125614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv" event={"ID":"b387e84c-77f4-458f-885a-3005a566a77a","Type":"ContainerStarted","Data":"35f2288c784fc54fac90bb053fdb85cd962469bbc9bae44f13a0ae67ffb90b74"}
Apr 16 17:44:42.150095 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:44:42.150045 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-779b5ff569-mdqgv" podStartSLOduration=1.408339829 podStartE2EDuration="6.150032172s" podCreationTimestamp="2026-04-16 17:44:36 +0000 UTC" firstStartedPulling="2026-04-16 17:44:36.582996704 +0000 UTC m=+236.834411507" lastFinishedPulling="2026-04-16 17:44:41.324689046 +0000 UTC m=+241.576103850" observedRunningTime="2026-04-16 17:44:42.149406638 +0000 UTC m=+242.400821464" watchObservedRunningTime="2026-04-16 17:44:42.150032172 +0000 UTC m=+242.401446976"
Apr 16 17:45:40.210468 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:45:40.210344 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 17:45:40.210468 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:45:40.210468 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 17:45:40.215204 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:45:40.215181 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 17:50:40.238595 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:50:40.238567 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 17:50:40.239151 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:50:40.238674 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 17:55:40.258080 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:55:40.258047 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 17:55:40.258958 ip-10-0-133-244 kubenswrapper[2579]: I0416 17:55:40.258926 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:00:40.278774 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:00:40.278738 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:00:40.280231 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:00:40.280207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:05:40.297751 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:05:40.297694 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:05:40.300579 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:05:40.300559 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:10:40.316535 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:10:40.316428 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:10:40.320674 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:10:40.319151 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:15:40.335449 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:15:40.335336 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:15:40.339407 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:15:40.338158 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:20:40.354643 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:20:40.354528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:20:40.358572 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:20:40.357502 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:25:40.375368 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:25:40.375230 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:25:40.379777 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:25:40.378270 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log"
Apr 16 18:28:53.541457 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.541417 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wm28s/must-gather-tclnn"]
Apr 16 18:28:53.544761 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.544742 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wm28s/must-gather-tclnn"
Apr 16 18:28:53.547650 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.547625 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wm28s\"/\"kube-root-ca.crt\""
Apr 16 18:28:53.547650 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.547635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wm28s\"/\"default-dockercfg-ws4vp\""
Apr 16 18:28:53.547843 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.547635 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wm28s\"/\"openshift-service-ca.crt\""
Apr 16 18:28:53.551571 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.551546 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm28s/must-gather-tclnn"]
Apr 16 18:28:53.624841 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.624801 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/840dae54-0c1d-4ce6-8bcb-059140ddf9e2-must-gather-output\") pod \"must-gather-tclnn\" (UID: \"840dae54-0c1d-4ce6-8bcb-059140ddf9e2\") " pod="openshift-must-gather-wm28s/must-gather-tclnn"
Apr 16 18:28:53.625022 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.624852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkskp\" (UniqueName: \"kubernetes.io/projected/840dae54-0c1d-4ce6-8bcb-059140ddf9e2-kube-api-access-pkskp\") pod \"must-gather-tclnn\" (UID: \"840dae54-0c1d-4ce6-8bcb-059140ddf9e2\") " pod="openshift-must-gather-wm28s/must-gather-tclnn"
Apr 16 18:28:53.726009 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.725971 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkskp\" (UniqueName: \"kubernetes.io/projected/840dae54-0c1d-4ce6-8bcb-059140ddf9e2-kube-api-access-pkskp\") pod \"must-gather-tclnn\" (UID: \"840dae54-0c1d-4ce6-8bcb-059140ddf9e2\") " pod="openshift-must-gather-wm28s/must-gather-tclnn" Apr 16 18:28:53.726188 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.726064 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/840dae54-0c1d-4ce6-8bcb-059140ddf9e2-must-gather-output\") pod \"must-gather-tclnn\" (UID: \"840dae54-0c1d-4ce6-8bcb-059140ddf9e2\") " pod="openshift-must-gather-wm28s/must-gather-tclnn" Apr 16 18:28:53.726348 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.726332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/840dae54-0c1d-4ce6-8bcb-059140ddf9e2-must-gather-output\") pod \"must-gather-tclnn\" (UID: \"840dae54-0c1d-4ce6-8bcb-059140ddf9e2\") " pod="openshift-must-gather-wm28s/must-gather-tclnn" Apr 16 18:28:53.736365 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.736339 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkskp\" (UniqueName: \"kubernetes.io/projected/840dae54-0c1d-4ce6-8bcb-059140ddf9e2-kube-api-access-pkskp\") pod \"must-gather-tclnn\" (UID: \"840dae54-0c1d-4ce6-8bcb-059140ddf9e2\") " pod="openshift-must-gather-wm28s/must-gather-tclnn" Apr 16 18:28:53.854757 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.854644 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wm28s/must-gather-tclnn" Apr 16 18:28:53.982513 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.982408 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm28s/must-gather-tclnn"] Apr 16 18:28:53.985292 ip-10-0-133-244 kubenswrapper[2579]: W0416 18:28:53.985263 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod840dae54_0c1d_4ce6_8bcb_059140ddf9e2.slice/crio-440d9b7a40615226da3b2370102534ad356de9e8f1857b70a8f4a259e24f4d58 WatchSource:0}: Error finding container 440d9b7a40615226da3b2370102534ad356de9e8f1857b70a8f4a259e24f4d58: Status 404 returned error can't find the container with id 440d9b7a40615226da3b2370102534ad356de9e8f1857b70a8f4a259e24f4d58 Apr 16 18:28:53.987171 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:53.987154 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:28:54.320493 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:54.320455 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/must-gather-tclnn" event={"ID":"840dae54-0c1d-4ce6-8bcb-059140ddf9e2","Type":"ContainerStarted","Data":"440d9b7a40615226da3b2370102534ad356de9e8f1857b70a8f4a259e24f4d58"} Apr 16 18:28:55.326267 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:55.326182 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/must-gather-tclnn" event={"ID":"840dae54-0c1d-4ce6-8bcb-059140ddf9e2","Type":"ContainerStarted","Data":"ef5e28cd9bcb31b2e922f38b2c283dbced4f30a6e3a0cc73d72d9e1d75bf14bd"} Apr 16 18:28:55.326267 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:55.326228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/must-gather-tclnn" 
event={"ID":"840dae54-0c1d-4ce6-8bcb-059140ddf9e2","Type":"ContainerStarted","Data":"35bd1720da82fea2abb709d4ffcc1bf153a96d1cc247d0b665733754d377eda2"} Apr 16 18:28:55.346171 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:55.346111 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wm28s/must-gather-tclnn" podStartSLOduration=1.558604449 podStartE2EDuration="2.346093281s" podCreationTimestamp="2026-04-16 18:28:53 +0000 UTC" firstStartedPulling="2026-04-16 18:28:53.987284205 +0000 UTC m=+2894.238699009" lastFinishedPulling="2026-04-16 18:28:54.774773038 +0000 UTC m=+2895.026187841" observedRunningTime="2026-04-16 18:28:55.345432596 +0000 UTC m=+2895.596847425" watchObservedRunningTime="2026-04-16 18:28:55.346093281 +0000 UTC m=+2895.597508119" Apr 16 18:28:56.407105 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:56.407060 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qgq8s_12b5b98a-b6cc-4ef2-85d8-f4ea700f0bba/global-pull-secret-syncer/0.log" Apr 16 18:28:56.555369 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:56.555323 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vfq9g_eff4ef24-cd37-4018-b078-2147a941d9e2/konnectivity-agent/0.log" Apr 16 18:28:56.603627 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:28:56.603595 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-244.ec2.internal_69c8b1379aebe31aba3bde3fe6d1e4ea/haproxy/0.log" Apr 16 18:29:00.192833 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.192745 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-794c6668b-clrf9_1621992d-f0ff-463a-bc83-c10aa76a4028/metrics-server/0.log" Apr 16 18:29:00.226493 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.226462 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-9bt6k_0ac1a402-aba7-4034-b952-df5f690dece5/monitoring-plugin/0.log" Apr 16 18:29:00.263248 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.263213 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dd897_6da03822-75a1-421e-be29-b8b96eba8b6b/node-exporter/0.log" Apr 16 18:29:00.301916 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.301870 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dd897_6da03822-75a1-421e-be29-b8b96eba8b6b/kube-rbac-proxy/0.log" Apr 16 18:29:00.330934 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.330909 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dd897_6da03822-75a1-421e-be29-b8b96eba8b6b/init-textfile/0.log" Apr 16 18:29:00.618974 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.618935 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/prometheus/0.log" Apr 16 18:29:00.643968 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.643938 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/config-reloader/0.log" Apr 16 18:29:00.672429 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.672397 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/thanos-sidecar/0.log" Apr 16 18:29:00.700638 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.700609 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/kube-rbac-proxy-web/0.log" Apr 16 18:29:00.728978 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.728943 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/kube-rbac-proxy/0.log" Apr 16 18:29:00.756008 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.755979 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/kube-rbac-proxy-thanos/0.log" Apr 16 18:29:00.783968 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.783933 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0687f8a4-c1f4-43b6-8096-86bed1a2a48f/init-config-reloader/0.log" Apr 16 18:29:00.821085 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.821047 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-l48nd_63c8e7ee-6b19-4728-9ac0-160f0b1b974f/prometheus-operator/0.log" Apr 16 18:29:00.847921 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:00.847893 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-l48nd_63c8e7ee-6b19-4728-9ac0-160f0b1b974f/kube-rbac-proxy/0.log" Apr 16 18:29:01.003226 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:01.003117 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d4b74bc65-qcwnm_42c552ad-3526-4acd-8d4a-835ad2b66320/thanos-query/0.log" Apr 16 18:29:01.035361 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:01.035331 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d4b74bc65-qcwnm_42c552ad-3526-4acd-8d4a-835ad2b66320/kube-rbac-proxy-web/0.log" Apr 16 18:29:01.066070 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:01.066030 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d4b74bc65-qcwnm_42c552ad-3526-4acd-8d4a-835ad2b66320/kube-rbac-proxy/0.log" Apr 16 18:29:01.099112 ip-10-0-133-244 kubenswrapper[2579]: I0416 
18:29:01.099084 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d4b74bc65-qcwnm_42c552ad-3526-4acd-8d4a-835ad2b66320/prom-label-proxy/0.log" Apr 16 18:29:01.128338 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:01.128308 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d4b74bc65-qcwnm_42c552ad-3526-4acd-8d4a-835ad2b66320/kube-rbac-proxy-rules/0.log" Apr 16 18:29:01.160951 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:01.160912 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d4b74bc65-qcwnm_42c552ad-3526-4acd-8d4a-835ad2b66320/kube-rbac-proxy-metrics/0.log" Apr 16 18:29:03.653402 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.653369 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f"] Apr 16 18:29:03.657841 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.657811 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.667509 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.667386 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f"] Apr 16 18:29:03.831743 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.831687 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl428\" (UniqueName: \"kubernetes.io/projected/fdd47d0c-9e1f-4a19-a07e-ec648d491228-kube-api-access-fl428\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.831942 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.831754 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-podres\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.831942 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.831780 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-proc\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.831942 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.831794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-lib-modules\") pod \"perf-node-gather-daemonset-zd65f\" (UID: 
\"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.831942 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.831916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-sys\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.933747 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl428\" (UniqueName: \"kubernetes.io/projected/fdd47d0c-9e1f-4a19-a07e-ec648d491228-kube-api-access-fl428\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.933747 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933326 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-podres\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.933747 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933363 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-proc\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.933747 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-lib-modules\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.933747 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-sys\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.933747 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933589 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-sys\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.934071 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933755 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-proc\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.934071 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933819 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-lib-modules\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.934071 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.933819 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fdd47d0c-9e1f-4a19-a07e-ec648d491228-podres\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.944294 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.944223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl428\" (UniqueName: \"kubernetes.io/projected/fdd47d0c-9e1f-4a19-a07e-ec648d491228-kube-api-access-fl428\") pod \"perf-node-gather-daemonset-zd65f\" (UID: \"fdd47d0c-9e1f-4a19-a07e-ec648d491228\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:03.973072 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:03.973035 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:04.121748 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.121598 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f"] Apr 16 18:29:04.126457 ip-10-0-133-244 kubenswrapper[2579]: W0416 18:29:04.126418 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfdd47d0c_9e1f_4a19_a07e_ec648d491228.slice/crio-e4594e27f7bb6c5b60ea578e4900fed88b38d2cc596ec3c89ac768bac9e0f932 WatchSource:0}: Error finding container e4594e27f7bb6c5b60ea578e4900fed88b38d2cc596ec3c89ac768bac9e0f932: Status 404 returned error can't find the container with id e4594e27f7bb6c5b60ea578e4900fed88b38d2cc596ec3c89ac768bac9e0f932 Apr 16 18:29:04.365929 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.365883 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" 
event={"ID":"fdd47d0c-9e1f-4a19-a07e-ec648d491228","Type":"ContainerStarted","Data":"f276949f3806fea91f6ee0a60d85b54dfd042006cdc7b569de5a0884df61e5cf"} Apr 16 18:29:04.365929 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.365935 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" event={"ID":"fdd47d0c-9e1f-4a19-a07e-ec648d491228","Type":"ContainerStarted","Data":"e4594e27f7bb6c5b60ea578e4900fed88b38d2cc596ec3c89ac768bac9e0f932"} Apr 16 18:29:04.366512 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.366475 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:04.385089 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.385032 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" podStartSLOduration=1.385016795 podStartE2EDuration="1.385016795s" podCreationTimestamp="2026-04-16 18:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:04.384580176 +0000 UTC m=+2904.635995026" watchObservedRunningTime="2026-04-16 18:29:04.385016795 +0000 UTC m=+2904.636431620" Apr 16 18:29:04.479622 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.479588 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ck9nk_6e60f91f-0be5-4df7-94f8-f46703367246/dns/0.log" Apr 16 18:29:04.514651 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.514555 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ck9nk_6e60f91f-0be5-4df7-94f8-f46703367246/kube-rbac-proxy/0.log" Apr 16 18:29:04.611546 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:04.611509 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-glb6b_c8450dc7-532b-4e18-b522-ff27fa85e7be/dns-node-resolver/0.log" Apr 16 18:29:05.163770 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:05.163740 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wzns8_012ac08b-278c-44fb-8aac-833db15265e1/node-ca/0.log" Apr 16 18:29:06.343679 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:06.343651 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d96c2_c3ed8245-9e98-4c4e-903f-8729ad167125/serve-healthcheck-canary/0.log" Apr 16 18:29:06.824436 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:06.824406 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dv4hr_c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2/kube-rbac-proxy/0.log" Apr 16 18:29:06.850069 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:06.850041 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dv4hr_c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2/exporter/0.log" Apr 16 18:29:06.873567 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:06.873528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dv4hr_c0f29aeb-77e1-4b6f-a1d3-b3ab757c6fc2/extractor/0.log" Apr 16 18:29:10.385368 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:10.385334 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-zd65f" Apr 16 18:29:14.979513 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:14.979411 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5z7gp_4d55587d-f876-4aba-a477-18275926697a/kube-multus/0.log" Apr 16 18:29:15.338298 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.338263 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/kube-multus-additional-cni-plugins/0.log" Apr 16 18:29:15.361305 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.361276 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/egress-router-binary-copy/0.log" Apr 16 18:29:15.386970 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.386937 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/cni-plugins/0.log" Apr 16 18:29:15.411485 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.411448 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/bond-cni-plugin/0.log" Apr 16 18:29:15.435285 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.435253 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/routeoverride-cni/0.log" Apr 16 18:29:15.459276 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.459249 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/whereabouts-cni-bincopy/0.log" Apr 16 18:29:15.481638 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.481607 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ckg4v_b67eedf2-6206-4ed3-9f3d-437023b25e92/whereabouts-cni/0.log" Apr 16 18:29:15.577510 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.577468 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8zmrl_c8eeb2ef-1018-4541-8fb1-d6d55ca7680f/network-metrics-daemon/0.log" Apr 16 
18:29:15.601976 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:15.601893 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8zmrl_c8eeb2ef-1018-4541-8fb1-d6d55ca7680f/kube-rbac-proxy/0.log" Apr 16 18:29:17.386093 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.386064 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-controller/0.log" Apr 16 18:29:17.411403 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.411342 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/0.log" Apr 16 18:29:17.441460 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.441425 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovn-acl-logging/1.log" Apr 16 18:29:17.463607 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.463572 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/kube-rbac-proxy-node/0.log" Apr 16 18:29:17.490095 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.490066 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 18:29:17.515871 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.515837 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/northd/0.log" Apr 16 18:29:17.543936 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.543907 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/nbdb/0.log" Apr 16 
18:29:17.569359 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.569335 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/sbdb/0.log" Apr 16 18:29:17.767186 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:17.767106 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kd892_92a42226-59e2-448d-8e37-54365cce5c71/ovnkube-controller/0.log" Apr 16 18:29:18.885751 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:18.885693 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-792zb_d4e56557-2a7a-4842-bf26-96d804cdf01b/network-check-target-container/0.log" Apr 16 18:29:19.946609 ip-10-0-133-244 kubenswrapper[2579]: I0416 18:29:19.946579 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s27g6_ead1084e-aa6f-4c13-8538-b128d209d29d/iptables-alerter/0.log"